
Integration Processors

Describes the different types of processors available within Optimizely Configured Commerce.

The lifecycle of an integration job consists of three steps:

  1. Preprocessor
  2. Integration processor
  3. Postprocessor

Preprocessors

Preprocessors are called on the Website side by the Integration Web Service method IWebServiceHandler.GetIntegrationJob and perform preprocessing for the Integration Job before it is returned from the Integration Web Service to the WindowsIntegrationBroker.

Each preprocessor is listed below by name, followed by usage notes.

None

This is used when there is no need for a preprocessor.

GenericSubmit

Populates the initial dataset that gets passed to the integration processor from the object in each step. A step parameter named for the object's natural key must be specified, along with the key value. For example, if you are submitting a Product, add a step parameter named "ERPNumber" with a value equal to the ERPNumber of the product you are submitting. This passes that product to the integration processor.

LookupInformation

This creates the following tables in IntegrationJob.InitialDataSet:

  1. ProductLookup table
  2. CustomerLookup table
  3. MiscLookup table

These tables can be used by the integration processor to look up the GUIDs for products, customers, and the integration user profile Id, respectively.

This is a rarely used processor, since the Integration Framework will normally take the "natural keys" from the source system and look up the related entities on the server side. There is a process for Pricing, however, that requires embedding the actual product and customer Ids into the resulting dataset.

Exercise caution when using this processor: it sends down all customers and products, and the data is only available for the duration of the current job being run on the WIS.

SqlQuery

Creates the SQL statement from the following fields:

  1. Select
  2. From
  3. Where
  4. Parameterized Where

The SQL statement is then stored in JobDefinitionStep.IntegrationQuery to be run by the integration processor.

If there are step parameters, the Parameterized Where clause is used in place of the Where clause.
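
As a rough illustration of how these fields fit together (the table and column names below are invented for the example, not taken from any real schema), the composed statement might look like this:

```csharp
// Illustrative only: hypothetical field values and the statement they would produce.
var selectClause             = "ItemNumber, Description, UnitOfMeasure";
var fromClause               = "dbo.ItemMaster";
var whereClause              = "UpdatedOn >= DATEADD(day, -1, GETDATE())";
var parameterizedWhereClause = "ItemNumber = @ERPNumber";

// With no step parameters, the standard Where clause is used...
var integrationQuery = $"SELECT {selectClause} FROM {fromClause} WHERE {whereClause}";

// ...but when a step parameter (here, ERPNumber) is supplied, the Parameterized Where
// clause takes its place, referencing the parameter by name preceded by @.
var parameterizedQuery = $"SELECT {selectClause} FROM {fromClause} WHERE {parameterizedWhereClause}";
```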

Additionally, custom Preprocessors can be created by implementing the IJobPreprocessor interface. This gives developers the flexibility of populating the IntegrationJob object to meet the specific needs of the job.
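
A bare-bones sketch of such a preprocessor is shown below. IJobPreprocessor is the real extension point, but the member names and signatures in this sketch are assumptions made for illustration; check the Integration Framework assemblies for the actual contract.

```csharp
// A minimal sketch of a custom preprocessor. The member names and signatures shown here
// are assumptions for illustration only - verify them against the installed assemblies.
public class ExampleSeedDataPreprocessor : IJobPreprocessor
{
    // Assumed: the framework hands the preprocessor the job it is preparing.
    public IntegrationJob IntegrationJob { get; set; }

    // Assumed entry point: populate the initial dataset before the job is returned
    // to the WindowsIntegrationBroker (the documentation above describes
    // IntegrationJob.InitialDataSet as holding DataTables).
    public void Execute()
    {
        var table = new System.Data.DataTable("ExampleSeedData");
        table.Columns.Add("NaturalKey", typeof(string));
        table.Rows.Add("EXAMPLE-KEY"); // hypothetical seed value for the integration processor

        IntegrationJob.InitialDataSet.Tables.Add(table);
    }
}
```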

Integration processors

Integration processors are called by the WindowsIntegrationBroker.ProcessJob method once for each IntegrationJobDefinitionStep of the IntegrationJob. They perform the reading from, or writing to, the external system being integrated with and return a DataSet. Each IntegrationJobDefinitionStep returns a DataSet, and these DataSets are merged together to produce the ResultsDataSet that is posted back to the Configured Commerce Website Integration Web Service.
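
Conceptually, the merge step can be pictured like the sketch below. This is only an illustration of standard DataSet merge semantics, not the broker's actual code.

```csharp
// Illustrative only: per-step DataSets combined into a single results DataSet using the
// standard System.Data merge semantics. The broker's real implementation may differ.
using System.Collections.Generic;
using System.Data;

public static class StepResultMerger
{
    // One DataSet is returned per IntegrationJobDefinitionStep; the merged result is the
    // ResultsDataSet posted back to the Integration Web Service.
    public static DataSet CombineStepResults(IEnumerable<DataSet> stepDataSets)
    {
        var resultsDataSet = new DataSet("Results");
        foreach (var stepDataSet in stepDataSets)
            resultsDataSet.Merge(stepDataSet);
        return resultsDataSet;
    }
}
```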

Each integration processor is listed below by name, followed by its step parameters (if any) and usage notes.

None

This is used when there is no need to perform any logic on the integration server. This is typically used on jobs where only a postprocessor is needed, such as rebuilding indexes or sitemaps. The external WIS processor is used simply to get the job to be processed.

CleanupDataSets

RetentionDays

Deletes all archived datasets on the integration server that are older than the "RetentionDays" parameter.

FileUpload

RemoteServerUrl

SourceDirectory

SourceMask

DestinationDirectory

DeleteSourceFilesOnCopy

This can be used to copy files from the integration server to the web server.

The SourceMask can be used with pipe delimiters to select files with differing masks and can even include subdirectories.

The destination directory is always relative to the UserFiles directory - do not prefix it with a slash.

DeleteSourceFilesOnCopy accepts a value of True or False.

The default location for the API call can be overridden by specifying a "RemoteServerUrl". This should rarely be needed.

FlatFile

This can be used when reading data from a flat file source such as CSV, XLS, or XLSX. The Columns clause (which replaces Select) should define all of the columns to be read - the system will error out if the file does not match the columns. The titles, if provided, are irrelevant; the system simply matches the values on their ordinal position (a short sketch of this appears after the SqlQuery entry below).

The "Where" clause can be used to filter the data returned.

OdbcQuery

This connects to a remote ODBC data source and retrieves and returns data via a Windows service located external to the web server requesting the data.

The query is constructed from the "Select", "From", and "Where" or "Parameterized Where" fields.

If a step parameter is present and the parameter is passed into the integration job, then the Parameterized Where clause is invoked rather than the standard Where clause. This is used when, for example, a single customer is requested to be refreshed from the Management Console - the value of the current customer is sent into the job, and the Parameterized Where clause is invoked in place of the standard Where clause.

Parameters are referenced in the Parameterized Where clause by using the parameter name preceded by @ (for example, @CustomerNumber).

OleDbQuery

This connects to a remote OLE DB data source and retrieves and returns data via a Windows service located external to the web server requesting the data.

The query is constructed the same as for an OdbcQuery.

SqlQuery

This connects to a SQL Server database and retrieves and returns data via a Windows service located external to the web server requesting the data.

The query is constructed the same as for an OdbcQuery.
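
As noted for the FlatFile processor above, values are matched to the Columns clause purely by ordinal position. The sketch below illustrates that idea with invented column names and data; it is not the processor's actual parsing code.

```csharp
// Illustrative only: ordinal matching as described for the FlatFile processor.
// Header titles in the file are ignored; position alone determines which column a value fills.
using System.Linq;

var columns = new[] { "ERPNumber", "Description", "UnitPrice" }; // hypothetical Columns clause
var csvRow  = "AB-100,Widget,12.50";                             // hypothetical data row

var record = columns
    .Zip(csvRow.Split(','), (name, value) => (name, value))
    .ToDictionary(pair => pair.name, pair => pair.value);

// record["UnitPrice"] == "12.50" - the third column, regardless of any header text.
```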

Additionally, custom integration processors can be created by implementing the IIntegrationProcessor interface. This gives developers the flexibility of creating and returning the DataSet to meet the specific needs of the job.

There will typically be custom processors for different ERPs or specific endpoints that need to gather data. Examples would include a CustomerSubmit, OrderSubmit and PricingRefresh. These are normally packaged up and delivered for specific integration endpoints.
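
A bare-bones sketch of a custom integration processor is shown below. IIntegrationProcessor is the real extension point, but the method shape in the sketch is an assumption made for illustration; verify it against the Integration Framework assemblies before implementing.

```csharp
// A minimal sketch of a custom integration processor. The method signature shown here is an
// assumption for illustration only - check the installed assemblies for the actual contract.
public class ExamplePricingRefreshProcessor : IIntegrationProcessor
{
    // Assumed signature: called once per job definition step and expected to return the
    // DataSet for that step, which the broker merges into the ResultsDataSet.
    public System.Data.DataSet Execute(IntegrationJob integrationJob, JobDefinitionStep jobStep)
    {
        var dataSet = new System.Data.DataSet("PricingRefresh");
        // Read from or write to the external system here (ERP call, file read, etc.)
        // and fill dataSet with the rows for this step.
        return dataSet;
    }
}
```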

Postprocessors

Postprocessors are run on the Configured Commerce Website by the Integration Web Service method IWebServiceHandler.ProcessIntegrationJobResults. The ResultsDataSet is passed to the postprocessor, which processes that DataSet to, for example, update Products for a Product Refresh or set the CustomerOrder.Status and CustomerOrder.ERPOrderNumber for a CustomerOrder Submit.

Each postprocessor is listed below by name, followed by its parameters (if any) and usage notes.

None

This is used when there is no need to perform any logic on the Configured Commerce web server after the integration processor finishes. An example of this might be an Order Submit.

BuildSitemap

This processor rebuilds the website's sitemap. When this is the postprocessor, typically no preprocessor or integration processor is needed.

CleanupDataSets

RetentionDays

Deletes all archived datasets in the web server's /DataSet folder that are older than the "RetentionDays" parameter.

CreateCategoryAttributeTypes

Executes SQL that first deletes and then auto-creates CategoryAttributeTypes by looking at products, which categories they are assigned to, and what product filter values exist.

CurrencyRefresh

Gets the latest exchange rates for all currencies in the system. This uses the application setting called "CurrencyConverter" to select the converter to use. Configured Commerce currently supports WebserviceX.

Additionally, there is an application setting called "CurrencyRatePurgeDaysBack" that is used to automatically purge conversion entries over a certain age.

ExecuteStoredProcedure

StoredProcedureName

Executes a stored procedure in the Configured Commerce database based on the job definition parameter "StoredProcedureName".

FieldMap

Field Mapping

Maps the data from the dataset returned by the integration processor to the Configured Commerce database through objects. See the Field mapping section below for more details.

JobStatusReport

LookBackHours

ClientName

Environment

IgnoreJobs

Creates an email of the statuses of the jobs that have been run based on "LookBackHours". The system defaults to 24 hours if this is not provided.

The environment is typically the actual URL and is used in constructing the job status report.

IgnoreJobs is a comma-separated list of jobs that should be specifically excluded from the report.

ProcessSubmitResponse

Handles all post-processing after a Customer or Order submit. Typically this involves syncing the Configured Commerce data to what was submitted to the ERP.

It updates the customer or order object respectively with the data returned from the ERP system.

When the submits are returned, the CustomerOrder is updated with the ERPOrderNumber and its status is set to "Processing". A Customer Submit updates the ERPCustomerNumber and ERPCustomerSequence on the customer record.

ProductFeedBazzarVoice

ProductFeedCertona

ProductFeedMonetate

BaseUrl

These are predefined integrations to some third-party partner systems. All of the fields and formatting are handled entirely within the processor.

The "BaseUrl" needs to be specified. This is the base URL for the site (that is, www.mysite.com).

RebuildProductSearchIndex

Rebuilds the product search index. This will always perform a full re-index.

RefreshDynamicCategories

RefreshSearch

This regenerates all of the dynamic categories. It will rebuild the search index if the "RefreshSearch" parameter is set to true.

ReturnDataSet

This sets the IntegrationJob.ResultDataSet on the IntegrationJob to the DataSet returned from the integration processor. It is normally used for RealTime tasks that just want to query the ERP.

Additionally, custom postprocessors can be created by implementing the IPostprocessor interface. This gives developers the flexibility to meet the specific needs of the job.
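
As with the other extension points, the sketch below is illustrative only: IPostprocessor is the real interface, but the member names shown are assumptions and should be checked against the installed assemblies.

```csharp
// A minimal sketch of a custom postprocessor. The member names shown here are assumptions
// for illustration only.
public class ExampleOrderSubmitPostprocessor : IPostprocessor
{
    // Assumed: the framework supplies the job whose ResultsDataSet was posted back.
    public IntegrationJob IntegrationJob { get; set; }

    // Assumed entry point: walk the returned data and update Configured Commerce entities,
    // for example setting CustomerOrder.ERPOrderNumber and CustomerOrder.Status as
    // described for ProcessSubmitResponse above.
    public void Execute()
    {
        // Process the job's returned DataSet here.
    }
}
```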

Field mapping

The Field Mapping tab is a powerful tool that can be leveraged to map source data to destination data through objects. This can be done by specifying the field type, from property, to property, overwrite, and dataset key values for each element to be mapped.

If you are using delta datasets, the system uses a hash key comprising the select clause and any step parameters with their values for comparison between datasets. The dataset key is vital in this process to ensure that records are written uniquely into the dataset itself; the dataset key fields must resolve to a unique key within the dataset.

Some of the field mappings have to be entered manually for varying reasons and will not automatically show up in the drop-down box. Examples could include child collections that are supported but do not populate in the drop-down correctly - this is a defect in the system that will be resolved with the new Management Console rewrite in 4.2.

"From" and "To" are relative to the type of job you are running. For a refresh job "From" represents that data in the ERP or other back end system and "To" the Configured Commerce object model. For a submit job, this is reversed.

The following field types are supported, listed by name with usage notes.

Field

Populates the "From Property" to the "To Property"

StaticValue

Populates a static value for all records to the "To Property"

ApplicationSetting

Populates the "To Property" from the application setting specified in the "From Property"

Lookup

Looks up a parent object based on a natural key. The object to look up is the "To Property". The natural key is stored in the value specified in the "From Property".

There are a few "multi-level" lookups that are explicitly supported in the Lookup function.

The first example is mapping to the StyleTraitValue object. The "To Property" will be StyleTrait as a lookup, and the "From Property" will have to use the data fields representing StyleClass and StyleTrait (comma-separated). For example, if the style class is Shirts and there is a trait of Size, then the data feed for the value needs to include the class, the trait, and the value. This convention is used to map the natural key up the chain from the style trait to its style class.

This tiered lookup is also available for Attribute/AttributeType/AttributeValue.

ChildCollection

Specifies a relationship to a child collection that is specified in the "To Property". The natural key is held in the "From Property". For example, a product can be a child collection of a category.

Content

Creates content in the "To Property" (ContentManager, DocumentManager, or SubscriptionShipVia). The "From Property" contains the field that stores the content and, optionally, the language code, persona name, and device type fields provided in the data stream. They must be entered in that exact order with commas as the separator. The system will use a default value if a data value is not supplied.
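
To illustrate the ordering convention (the field names below are invented for the example):

```csharp
// Illustrative only: the comma-separated "From Property" convention for a Content mapping.
// The order is fixed - content field first, then (optionally) language code, persona name,
// and device type.
var contentFromProperty = "ProductHtmlDescription,LanguageCode,PersonaName,DeviceType";
var contentToProperty   = "ContentManager"; // or DocumentManager / SubscriptionShipVia
```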