CerebroSQL

Transfer data - lite (ETL)

Transfer data (lite) is a simple ETL mechanism for transferring data between different DBMSs (Oracle, PostgreSQL, MySQL, MSSQL).

Creating a Data Migration Flow:

In the main program window, select DB -> Transfer data (lite)


 

The main manager window collects the log of the ETL flows.

To enable data transfer tracing, set the "Trace speed" checkbox on the sheet before starting the flow.

Creating a sheet and starting a data migration flow:

1. Press the "Create new list" button

2. Set up the connection "Connect #1"

  2.1 Go to the required tab (Oracle, PostgreSQL (including PostgreSQL-based DBMSs, for example Greenplum), MySQL (including MySQL-based DBMSs, for example MariaDB), MSSQL)

  2.2 Fill in all fields

  2.3 Press the "Connect" button. If the connection is successful, the caption changes to "Connected" and all fields of the "Connect #1" block become inactive.

3. Set up the connection "Connect #2"

  3.1 Go to the required tab (Oracle, PostgreSQL (including PostgreSQL-based DBMSs, for example Greenplum), MySQL (including MySQL-based DBMSs, for example MariaDB), MSSQL)

  3.2 Fill in all fields

  3.3 Press the "Connect" button. If the connection is successful, the caption changes to "Connected" and all fields of the "Connect #2" block become inactive.

Transfer data (ETL) - Connected database

4. Flow settings:

  4.1 CommitCount - determines the size of the data portion (the number of rows loaded into PC memory from the source database); after each portion is inserted, a commit is performed in the destination database. It is recommended to set this value in the range from 300 to 3000 rows; as the value increases, the PC memory usage increases. For example, with CommitCount = 1000 the flow reads 1000 rows from the source, inserts them into the destination, commits, and then fetches the next 1000 rows.

  4.2 Select a mechanism (driver) that retrieves data from the source database:

   4.2.1  DATASET(SQL) - uses a classic DataSet component that retrieves data with an SQL query

   4.2.2  SQLSET(SQL) - uses a component that retrieves data with an SQL query

   4.2.3  SQLSET(TABLE) - uses the same component; the input parameter is the name of the table whose data you want to transfer

  4.3 IN Query - enter the text of the SQL query or the name of the table (if necessary, with the name of the schema in which it is located); see the examples after this procedure

  4.4 Loop (one line - one query, without ";" at the end) - when this checkbox is set, the program treats each line in the IN Query field as a separate query for retrieving data. The lines are processed in a loop and a data transfer is started for each line in turn. IMPORTANT: a line must not end with ";" (see the example after this procedure).

  4.5 Select a mechanism (driver) that queries the destination database to obtain metadata about the table structure:

   4.5.1  DATASET(SQL) - uses a classic DataSet component that retrieves data with an SQL query

   4.5.2  SQLSET(SQL) - uses a component that retrieves data with an SQL query

   4.5.3  SQLSET(TABLE) - uses the same component; the input parameter is the name of the table whose metadata is to be retrieved.

  4.6 OUT Query - enter the text of the SQL query or the name of the table (if necessary, with the name of the schema in which it is located); see the examples after this procedure

  4.7 Direct - use a specialized data capture mode.

  4.8 Reind - the dataset is previewed (pre-read) by the component

  4.9 Optimize - optimizes the data reading speed

5. Press the "Run data transfer" button

To stop the flow, click the "Stop data transfer" button.
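
The values entered in IN Query and OUT Query depend on the selected driver. The following is a minimal illustration; the schema and table names (app.orders, dbo.orders) and the column list are hypothetical and only show the expected form of the input:

Source driver DATASET(SQL) or SQLSET(SQL) - IN Query contains the text of an SQL query:
  select id, customer_id, amount, created_at from app.orders

Source driver SQLSET(TABLE) - IN Query contains only the table name (with the schema name if needed):
  app.orders

Destination driver SQLSET(TABLE) - OUT Query contains the name of the destination table whose structure is read:
  dbo.orders

Destination driver DATASET(SQL) or SQLSET(SQL) - OUT Query contains an SQL query against the destination table:
  select * from dbo.orders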
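
With the Loop checkbox set, each line of the IN Query field is executed as a separate query and the transfers run one after another. A sketch of such a multi-line IN Query (hypothetical table names; note that no line ends with ";"):

  select * from app.orders_2019
  select * from app.orders_2020
  select * from app.orders_2021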

Data transfer speed.

The speed of the data transfer depends both on the DBMSs between which the data migration takes place and on the performance of the disks.

 

Test case:

IN connection:

DBMS: Postgres

Host: localhost

SSD drive

OUT connection:

DBMS: MS SQL Server

Host: virtual machine

Table: created from DBA_TABLES view

 

*****2020.09.20 00:46:18*****
Type:SQLSET(TABLE)
SQL: testtb3

Timed: 189.638
RowPerSec: 8623
Read: 1629693
Move: 1629693
error: 0
End: 2020.09.20 00:49:28
***************************

Transfer data - Speed row
Transfer data - Speed mb