Free Trial Hands-on Guide

The objective of this step-by-step guide is to give you the foundation you need to get started with Exasol, along with some easy hands-on exercises. These exercises let you try out some of Exasol's key functionalities.

Additionally, the Data Science and UDFs Examples section provides UDF tutorials based on real-world examples and shows how to use them for Data Science within Exasol.

Step 1: Get Access to Free Trial System

The free trial system is hosted on ExaCloud, and you can access it by signing up here. You will shortly receive a confirmation email with the credentials to log into the Exasol demo system. Since the demo system is hosted on ExaCloud, you do not have to perform any additional setup or hardware configuration.

An empty database schema is created for you, where you can create tables, views, functions, and scripts. Additionally, the database includes some preloaded datasets.

Step 2: Download and Connect SQL Clients

You can connect a variety of SQL clients to Exasol. Some of the tools we recommend are: 

Once you have installed the SQL client of your choice, you can connect it to the free trial system by using the credentials provided in the email.
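For example, an Exasol JDBC connection string typically takes the following shape (the host and port here are placeholders; use the values from your credentials email):

```
jdbc:exa:<host>:<port>
```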

Since an empty database schema is created for you in the free trial system, run SELECT CURRENT_USER to view the name of the current user, and run the OPEN SCHEMA statement to open the schema that was created for your username. For example: 

SELECT CURRENT_USER;
OPEN SCHEMA <YOUR USERNAME>;

<YOUR USERNAME> is the result of the query SELECT CURRENT_USER.

Other useful resources include: 

Step 3: ANSI SQL Examples

Now that you have your SQL client installed and connected to the Exasol database, let's create tables, insert data, and execute some queries on the preloaded data. This section provides you with some of the ANSI SQL statements and examples you can run on your Exasol database.

For in-depth information about the syntax and semantics of SQL statements, data types, functions, and other SQL language elements supported by Exasol, refer to the SQL Statements section.

CREATE TABLE

This statement creates a new table in your schema.

CREATE TABLE t (a VARCHAR(20),
           b DECIMAL(24,4) NOT NULL,
           c DECIMAL DEFAULT 122,
           d DOUBLE,
           e TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
           f BOOL);

For more information, see CREATE TABLE.

DESCRIBE and DESCRIBE FULL

This statement prints the column information for a given table or view.

DESCRIBE t;
DESCRIBE FULL t;

For more information, see DESC[RIBE].

INSERT

This statement inserts data into the newly created table. For example: 

INSERT INTO
    t VALUES
    (
        'hello Exasol',
        2.4444,
        '',
        10.10,
        DEFAULT,
        0
    ),
    (
        'Hallo Exasol!!',
        10.4444,
        '',
        20.20,
        DEFAULT,
        0
    );

As a best practice, Exasol does not recommend single-row inserts. When loading large amounts of data, use the Exasol bulk loader instead (for example, to import from a local CSV file).
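For comparison, the bulk-loader form of loading the same table could be sketched as follows (the file path is a hypothetical example; Step 4 walks through importing in detail):

```sql
-- Sketch: load table t from a local CSV file in one bulk operation
-- instead of many single-row INSERT statements.
IMPORT INTO t FROM LOCAL CSV FILE 'C:\Temp\t_data.csv'
COLUMN SEPARATOR = ','
SKIP = 1; -- skip a header row, if present
```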

For more information, see INSERT.

SELECT

This statement can be used to retrieve data from tables. For example: 

  • To display the inserted data:
SELECT * FROM t;
  • To count the rows in the table: 
SELECT COUNT(a) FROM t; 
  • To sum the values of column b: 
SELECT SUM(b) FROM t; 
  • To display the current date:
SELECT CURRENT_DATE;
  • To convert a date, timestamp, or interval into a string:
SELECT TO_CHAR(CURRENT_DATE, 'DAY');

For more information, see SELECT.

SELECT INTO

This statement copies data from one table into a new table.

SELECT * INTO TABLE NATION_1GB FROM TPCH_1GB.NATION;

To confirm the table was created: 

SELECT * FROM NATION_1GB;

The TPCH_x schemas are used in the TPC-H benchmark tests and have been made available for you to experiment with.
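To see which tables a TPC-H schema contains, you can query the system catalog; for example, the EXA_ALL_TABLES system view lists the tables visible to you:

```sql
-- List the tables available in the TPCH_1GB schema
SELECT TABLE_NAME
FROM SYS.EXA_ALL_TABLES
WHERE TABLE_SCHEMA = 'TPCH_1GB';
```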

For more information, see SELECT INTO.

CREATE VIEW

Use this statement to create a view.

CREATE VIEW my_view AS SELECT * FROM NATION_1GB;

You can test the view by executing the following statement: 

SELECT * FROM my_view;

For more information, see CREATE VIEW.

TRUNCATE and DROP TABLE

You can clear the content of a table without removing the table by using the TRUNCATE statement.

TRUNCATE TABLE t;

You can delete a table by using the DROP TABLE statement.

DROP TABLE t CASCADE;

For more information, see TRUNCATE and DROP TABLE.

DROP VIEW

Use the DROP VIEW statement to delete a view.

DROP VIEW my_view CASCADE;

For more information, see DROP VIEW.

Other useful resources include: 

Step 4: Import / Export Data

You can import data into Exasol using an ETL process, by importing from local CSV files, by using UDFs, and so on. The Loading Data section in our documentation provides more information.

For the purpose of this hands-on exercise, let's do the following: 

Import from Local CSV

Follow the steps below to import data from a local CSV file:

  1. Download the items_table CSV file.
  2. Save this file to the folder of your choice.
  3. Open your schema:
    OPEN SCHEMA <schema name>;

  4. Create a table for importing this data. For example:
    CREATE TABLE
        test_import
        (
            ITEM_ID DECIMAL(18,0),
            ITEM VARCHAR(25),
            COST DECIMAL(18,2)
        );
  5. Next, run the following import statement:
    IMPORT INTO test_import FROM LOCAL CSV FILE
    'C:\Temp\items_table.csv' --location of file on local computer
    ENCODING = 'UTF-8'
    COLUMN SEPARATOR =','
    SKIP = 1;
  6. Check the content of the table by executing this statement: 
    SELECT * FROM test_import;

Import CSV from S3 Bucket

You can follow the steps below to import data from a CSV file in an S3 bucket. A connection to S3 named "s3_aws_import" is already created for you.

  1. Open your schema:
    OPEN SCHEMA <schema name>;

  2. Create a destination table called RETAIL_SALES_IMPORT as shown below: 
    CREATE TABLE
        "RETAIL_SALES_IMPORT"
        (
            "SALES_ID" INTEGER,
            "SALES_DATE" DATE,
            "SALES_TIMESTAMP" TIMESTAMP,
            "PRICE" DECIMAL(9,2),
            "MONEY_GIVEN" DECIMAL(9,2),
            "RETURNED_CHANGE" DECIMAL(9,2),
            "LOYALTY_ID" INTEGER,
            "MARKET_ID" SMALLINT NOT NULL,
            "TERMINAL_ID" SMALLINT,
            "EMPLOYEE_ID" SMALLINT,
            "TERMINAL_DAILY_SALES_NR" SMALLINT);
  3. Next, execute the following command, which specifies the table to import into, the S3 connection, and the location and name of the CSV file:
    IMPORT INTO "RETAIL_SALES_IMPORT" FROM CSV AT s3_aws_import FILE 'getting_started/retail_sales.csv';
  4. Test the import by executing the following commands: 
    SELECT COUNT(*) FROM "RETAIL_SALES_IMPORT";
    SELECT * FROM "RETAIL_SALES_IMPORT";

Export into Local CSV

You can export a table as a CSV file to your local system by executing the following commands: 

EXPORT
(SELECT * FROM test_import) INTO LOCAL CSV FILE 'C:\Users\<YOUR USERNAME>\Desktop\test1.csv' --location of file on local computer
WITH COLUMN NAMES
REPLACE;
EXPORT TABLE test_import INTO LOCAL CSV FILE 'C:\Users\<YOUR USERNAME>\Desktop\test2.csv'; --location of file on local computer

Import from Oracle Using Migration Scripts

You can import data from Oracle by following the steps below. For more information on how to connect Oracle to Exasol, refer to the Loading Data from Oracle section.

  1. Open your schema:
    OPEN SCHEMA <schema name>;

  2. Check for existing Oracle connections by listing them:
    -- List connections
    SELECT
        CONNECTION_NAME
    FROM
        "SYS"."EXA_ALL_CONNECTIONS"
    WHERE
        connection_name = 'OCI_ORACLE';

    In the free trial system, an Oracle connection is already created for you.

  3. To test a connection, run this command:
    SELECT * FROM
    (
    IMPORT FROM <conn_type> AT <name_of_connection>
    STATEMENT 'SELECT 42 FROM DUAL'
    );

    To test the Oracle connection that is already set up for you (OCI_ORACLE), run this command: 

    IMPORT FROM ORA AT OCI_ORACLE STATEMENT 'select 42 from dual';

  4. Next, create the migration script for importing data from Oracle to Exasol. Download the ORACLE_TO_EXASOL_PDEMO script file and run it in your SQL client to create the database migration script.
  5. After you have created the database migration script (ORACLE_TO_EXASOL_PDEMO), run it using the command below: 
    EXECUTE SCRIPT ORACLE_TO_EXASOL_PDEMO('OCI_ORACLE', true, 'RETAIL','ARTICLE');

  6. The ORACLE_TO_EXASOL_PDEMO script generates the SQL statements necessary to import data from Oracle. The following SQL statements are returned by the previous step; run them to import the data from Oracle to Exasol: 

    The ORACLE_TO_EXASOL_PDEMO migration script has been modified for the purpose of this exercise. To view the original file, see our GitHub repository.

    -- session parameter values are being taken from Oracle system wide database_parameters and converted. However these should be confirmed before use.
    -- Oracle DB's NLS_CHARACTERSET is set to : AL32UTF8
    -- ALTER SESSION SET NLS_DATE_LANGUAGE='AMERICAN';
    -- ALTER SESSION SET NLS_DATE_FORMAT='DD-MON-YY';
    -- ALTER SESSION SET NLS_TIMESTAMP_FORMAT='DD-MON-YY HH.MI.SS.FF6 AM';

    Create the Article table: 

    CREATE OR REPLACE TABLE
        "ARTICLE"
        (
            "ARTICLE_ID" DECIMAL(9,0) NOT NULL ,
            "DESCRIPTION" VARCHAR(100),
            "EAN" DECIMAL(18,0),
            "BASE_SALES_PRICE" DECIMAL(9,2),
            "PURCHASE_PRICE" DECIMAL(9,2),
            "PRODUCT_GROUP" DECIMAL(9,0),
            "PRODUCT_CLASS" DECIMAL(9,0),
            "QUANTITY_UNIT" VARCHAR(100),
            "TMP_OLD_NR" DECIMAL(9,0),
            "PRODUCT_GROUP_DESC" VARCHAR(100),
            "DISTRIBUTION_COST" DECIMAL(9,2)
        );

    Import into the ARTICLE table:

    import into "ARTICLE"( "ARTICLE_ID",
        "DESCRIPTION",
        "EAN",
        "BASE_SALES_PRICE",
        "PURCHASE_PRICE",
        "PRODUCT_GROUP",
        "PRODUCT_CLASS",
        "QUANTITY_UNIT",
        "TMP_OLD_NR",
        "PRODUCT_GROUP_DESC",
        "DISTRIBUTION_COST") from ORA at OCI_ORACLE statement
    'select
    "ARTICLE_ID",
        "DESCRIPTION",
        "EAN",
        "BASE_SALES_PRICE",
        "PURCHASE_PRICE",
        "PRODUCT_GROUP",
        "PRODUCT_CLASS",
        "QUANTITY_UNIT",
        "TMP_OLD_NR",
        "PRODUCT_GROUP_DESC",
        "DISTRIBUTION_COST" from "RETAIL"."ARTICLE"';

    Confirm that the table is populated with data: 

    SELECT COUNT(*) FROM "ARTICLE";

Virtual Schemas

Virtual schemas allow you to access other data sources. A virtual schema is a kind of read-only link to an external source and contains virtual tables that look like regular tables, except that the data is not stored locally.

A virtual schema is already created for you in your free trial system. In this hands-on exercise, let's open that virtual schema and access data from an Oracle database. Run the following command to do so: 

OPEN SCHEMA ORACLE_VIRTUAL_SCHEMA;

Next, you can review the tables in the virtual schema with the following SQL command: 

SELECT * FROM CAT;

Additionally, you can review further details of a table by using the DESCRIBE statement, as shown below:

DESCRIBE SALES_POSITIONS;
 
DESCRIBE ARTICLE;

Join Tables in Virtual Schema

The following example shows you how to join a virtual Oracle table with a local Exasol table that we imported in the last exercise:

SELECT
    a.*,
    sp.*
FROM
    "YOUR SCHEMA"."ARTICLE" a
JOIN
    oracle_virtual_schema.SALES_POSITIONS sp
ON
    a.article_id=sp.article_id;

Import Data from Virtual Tables

Apart from joining local tables with virtual tables, you can also import data from a virtual table into a local table. The following example shows how: 

CREATE TABLE
    "YOUR SCHEMA"."ORACLE_VS_SALES_POSITIONS" AS
SELECT
    *
FROM
    ORACLE_VIRTUAL_SCHEMA.SALES_POSITIONS LIMIT 100;

To view the created table: 

SELECT * FROM "YOUR SCHEMA"."ORACLE_VS_SALES_POSITIONS";

Step 5: Create Functions

Exasol provides a variety of built-in functions, such as scalar, aggregate, and analytical functions. In addition to these, Exasol also lets you create user-defined functions with the CREATE FUNCTION statement. Functions are written in a procedural SQL dialect and can be particularly helpful when migrating functions from Oracle or other databases, as well as for extending the set of functions available within Exasol.

The following example shows CREATE FUNCTION in action:

OPEN SCHEMA <YOUR USERNAME>;
 
 
--/
CREATE OR REPLACE function percentage(fraction DECIMAL, entirety DECIMAL)
RETURN VARCHAR(10)
IS
 res DECIMAL;
BEGIN
 IF entirety = 0
  THEN res := NULL;
 ELSE
  res := (100*fraction)/entirety;
 END IF;
 RETURN res || ' %';
END percentage;
/
CREATE TABLE my_table (fraction DECIMAL, entirety DECIMAL);
 
INSERT into my_table values(1,1),(1,2),(1,3),(1,5);
 
SELECT fraction, entirety, percentage(fraction,entirety) AS PERCENTAGE from my_table;

For additional information on CREATE FUNCTION, refer to the CREATE FUNCTION statement.

Step 6: Create Lua Scripts

Script programming allows you to execute multiple SQL commands sequentially and handle errors during the execution. You can run control jobs within the database (for example, complex loading processes) and simplify recurring jobs with parameterized scripts, such as the creation of a user including the password and privileges.
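The user-creation example mentioned above could be sketched as follows (the script name, user name, password, and granted privilege are illustrative assumptions, not objects that exist in the trial system):

```sql
--/
CREATE OR REPLACE SCRIPT create_basic_user(username, pwd) AS
-- Sketch: create a user with a password, then grant login privileges.
suc, res = pquery([[CREATE USER ::u IDENTIFIED BY ::p]], {u = username, p = pwd})
if not suc then
        error('Could not create user: '..res.error_message)
end
query([[GRANT CREATE SESSION TO ::u]], {u = username})
/
EXECUTE SCRIPT create_basic_user('TEST_USER', 'TEST_PASSWORD');
```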

Here is a basic Lua script that shows how to create a script, copy data into a sample table, and handle errors during execution.

--/
CREATE OR REPLACE SCRIPT create_small_sample(src) AS
--src = Name of Source
new_name = src .. '_SAMPLE'
--Copy
suc, res = pquery([[CREATE TABLE ::s_new AS SELECT * FROM ::s LIMIT 10]], {s_new = new_name, s=src})
-- Error Handling
if not suc then
        error('Something went wrong: '..res.error_message)
end
/
EXECUTE SCRIPT create_small_sample('NATION_1GB');
SELECT * FROM NATION_1GB_SAMPLE;

For additional information on script programming, refer to the Scripting section.

Next Steps

View the Data Science and UDFs Examples section for information on the capabilities of Exasol's UDFs, along with tutorials based on real-world datasets.