
xSuite Interface Windows Prism 5.x – Online Help

Task "DbTransfer"

The following properties apply to the "DbTransfer" task. This task is a generically configurable way of transferring data records from one supported database system to another.

Property

Description

Schedule[].Task[].InputDb.ConnectionString*(§)

Schedule[].Task[].InputDb.Password(§)

Schedule[].Task[].InputDb.Select*

In the .ConnectionString property, specify the connection string to the source database from which the data records are read.

Specify the SELECT query in the .Select property. For both properties, use the same syntax as for the connection of external data sources used in configuring a scenario (see External Data Sources). This task supports the same data sources as the task for connecting external data sources.

The password required in the connection string can be defined separately in .Password. The variable %Password% can then be used to include the password in the string.
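A minimal sketch of an InputDb block using the %Password% variable might look as follows. The connection string format as well as all server, database, and field names here are illustrative assumptions, not prescribed values:

   "InputDb": {
      "ConnectionString": "Provider=SQLOLEDB;Data Source=SRV01;Initial Catalog=SourceDb;User ID=reader;Password=%Password%;",
      "Password": "s3cret!",
      "Select": "SELECT [InField1], [InField2] FROM [InTable]"
   }

Keeping the password in the separate .Password property avoids repeating it verbatim inside the connection string.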

Schedule[].Task[].OutputDb.ConnectionString*(§)

Schedule[].Task[].OutputDb.Password(§)

Schedule[].Task[].OutputDb.Insert*

Schedule[].Task[].OutputDb.Delete

The .ConnectionString and .Password properties define the connection data to the target database to which the data is written. Microsoft SQL Server, MongoDB and systems that can be accessed via OLEDB are supported as external databases for write access.

To write the data, specify an INSERT statement in the .Insert property, using the SQL syntax or MongoDB syntax supported by the target system. For SQL databases, the values to be output should be included in the statement as typed variables in the form @fieldname. The Name parameter of the fields defined in Schedule[].Task[].Field[] must be used. As an additional alternative which applies to MongoDB databases only, you can insert field contents into the statement using a text replacement with %FieldName%. When doing so, ensure that the replacement text has the exact syntax required for the database system.
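For a MongoDB target, a text-replacement insert might be sketched as follows. This is purely illustrative: the exact statement syntax accepted for MongoDB depends on the target system, and the collection and field names are assumptions:

   "Insert": "db.OutCollection.insertOne({ OutField1: \"%FIELD1%\", OutField2: \"%FIELD2%\" })"

As noted above, with text replacement you must ensure that the replaced values produce exactly the syntax the database system requires (e.g., correct quoting of string values).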

In the .Delete property, you can optionally define a DELETE statement in native syntax to empty the target table before new data records are transferred.

Schedule[].Task[].Field[]

Definition of the fields that are transferred from the source data records to the target data records

Specify each field in the form of multiple parameters:

  1. Name(*): internal variable name

  2. InputName: name of the field in the source record

  3. Macro: field macro expression for generating or formatting field content

Base the name of the source field on the source system. In a relational database, for instance, the name should correspond to the field name returned by the SELECT query. Calculated or aggregated fields, however, do not have a dedicated name of their own. If you have not given such a field an explicit name via an AS clause, it is assigned the pseudonym Column {field number}.

You can also define fields that do not originate from the input data and therefore do not have an InputName. These fields are artificially added to the data record and can obtain the field value from a macro expression. If the field value is not obtained from a macro expression, the value will remain empty.

The Macro parameter prepares the value of a read field before it is output. This parameter can also generate new values. As a rule, the new values generated are based on values from other fields. The macro expression must have the syntax of a field macro. All fields of the current data record can be accessed in the macro expression via the @Fieldname variable. Currently, the insertion of variables via text replacement is not supported.
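Assuming the parameters are listed comma-separated in the order Name, InputName, Macro (extending the form used in the example configuration further down), a field list with a macro-generated field might be sketched like this; the third entry has no InputName and obtains its value from a macro instead:

   "Field": [
      "FIELD1, InField1",
      "FIELD2, InField2",
      "FIELD3, , ParseAsNumber(@FIELD1)"
   ]

The exact separator handling for the third parameter is an assumption here; only the ParseAsNumber function mentioned in this help is used.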

Schedule[].Task[].LogRecords

Schedule[].Task[].LogMacros

Boolean values determining whether the contents of read and prepared data records and the individual steps of macro execution are logged

In the standard system, information of this type is logged at the Debug log level. Compared to normal document processing, a transfer of database records tends to involve a significantly larger number of records. To keep the resulting lengthy logs manageable, you can switch this logging on or off separately for a database record transfer. However, we advise using this option only in exceptional cases (e.g., for problem analysis).

Default value: TRUE
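A sketch of deactivating both logging options within the task configuration:

   "LogRecords": false,
   "LogMacros": false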

Schedule[].Task[].MaxErrorRecords

Numerical value indicating the number of incorrectly processed data records after which the whole task is aborted

To tolerate isolated errors and transfer as many data records as possible without aborting, you can define a threshold for how many errors are tolerated before the task is aborted. There is no rollback for records that were already transferred successfully before an abort.

Default value: 0 (immediate abort after an error)
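A sketch of a task tolerating up to ten faulty records before aborting (the value 10 is an arbitrary illustration):

   "MaxErrorRecords": 10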

Example

The following excerpt from a task configuration serves as an example of a simple data transfer from a Microsoft SQL Server database table to another database table:

{
   "Type": "DbTransfer",
   "InputDb": {
      …
      "Select": "SELECT [InField1], [InField2], [InField3] FROM [InTable]"
   },
   "OutputDb": {
      …
      "Delete": "DELETE FROM [OutTable]",
      "Insert": "INSERT INTO [OutTable]([OutField1], [OutField2], [OutField3]) VALUES (@FIELD1, @FIELD2, @FIELD3)"
   },
   "Field": [
      "FIELD1, InField1",
      "FIELD2, InField2",
      "FIELD3, InField3"
   ]
}

This example shows how the internal field names are assigned to the input fields from the source table via the .Field list. The assignment to the output fields in the target table, on the other hand, takes place directly within the INSERT statement.

When transferring the source field values to the target field values, always ensure that the data types are compatible. If you do not use additional preparation with macro functions, the source values are transferred to the target fields unchanged. Even then, however, the read values are implicitly converted into an equivalent .NET data type and converted back for output into the target data type. All .NET data types are supported. Direct transfer of values can still be problematic for some special database data types if these types have no direct equivalent in .NET or if the source database and the target database do not use the same types.

If macro functions are used to generate and format the values, the scope of usable data types is further restricted to the types supported by the internal macro interpreter. The various forms of database data types can usually be implicitly converted into the data types of the macro interpreter. Text data types and numeric data types can usually be implicitly converted into the "Text" and "Number" types of the macro interpreter, for example, and can therefore be used in macros. Exceptions to the rule are data types that contain binary data. As an alternative to the internal macro functions, source values and target values can also be converted directly within the SELECT and INSERT statements using corresponding CAST functions.
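A conversion within the statements themselves might be sketched as follows for a SQL Server source and target; the field and table names are assumptions:

   "InputDb": {
      "Select": "SELECT CAST([InField1] AS NVARCHAR(50)) AS [InField1] FROM [InTable]"
   },
   "OutputDb": {
      "Insert": "INSERT INTO [OutTable]([OutField1]) VALUES (CAST(@FIELD1 AS INT))"
   }

Converting in the statements keeps binary or otherwise unsupported types out of the macro interpreter entirely.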

When using macro functions, note also that the internal macro interpreter does not support NULL values. Instead, it uses empty default values depending on the data type (e.g., an empty string ("") for an empty text value). A macro function therefore never returns NULL as a result. How NULL values are handled as input parameters depends on the specific implementation of the macro function. If, in the example above, the source field InField1 were a text field and the target field OutField1 had a numeric database data type, a suitable conversion macro would be ParseAsNumber(@FIELD1). If the source value were NULL, however, this function would return 0 as the empty numerical value. To pass the original NULL value through to the target field, an additional check of the form If(!IsNull(@FIELD1), ParseAsNumber(@FIELD1)) is required. The conversion is carried out only if the source value is not NULL; otherwise, the value remains unchanged.
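Placed in the field list, the NULL-safe conversion described above might be sketched as follows, again assuming the comma-separated Name, InputName, Macro parameter order; whether commas inside the macro expression need special handling by the parameter parser is not specified here and remains an assumption:

   "Field": [
      "FIELD1, InField1, If(!IsNull(@FIELD1), ParseAsNumber(@FIELD1))"
   ]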