The Salesforce Destination Component is an SSIS Data Flow Component for loading data into a Salesforce object.
Use the parameters below to configure the component.
Select an existing Salesforce connection manager.
Related Topics: Salesforce Connection Manager
Select a destination object action. This parameter has the following options:
- Create: Create a new record in the destination object.
- Update: Update an existing record in the destination object. You must specify the appropriate object record identifier for the update to work.
- Delete: Delete an existing record from the destination object. You must specify the appropriate object record identifier for the delete to work.
- Upsert (1.4 SR-3): Update or insert a record in the destination object. Selecting this value displays the dynamic parameter ExternalId.

After changing the action, you must use the Refresh command in the Column Mappings tab to reload the destination object metadata.
Specify the number of rows to be sent as a batch. The maximum for regular mode is 200. The maximum for bulk-load mode is 10,000.
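The batching behavior can be pictured with a short Python sketch (illustrative only; this is not the component's implementation):

```python
def chunk(rows, batch_size=200):
    """Split rows into batches no larger than batch_size.

    200 is the regular-mode maximum; bulk-load mode allows up to 10,000.
    """
    for i in range(0, len(rows), batch_size):
        yield rows[i:i + batch_size]

# 450 input rows become three batches of 200, 200, and 50 rows.
batches = list(chunk(list(range(450)), 200))
```

A larger batch size means fewer round trips to Salesforce, at the cost of larger requests.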
Select the bulk-load concurrency mode. This parameter has the following options:
- Parallel: Process batches in parallel mode (default).
- Serial: Process batches in serial mode.

Processing in parallel can cause database contention. When contention is severe, the job may fail. If you experience this issue, submit the job with the Serial concurrency mode, which guarantees that batches are processed one at a time. Note that using this option may significantly increase the processing time of a job.
Specify the destination Salesforce object where the data is to be loaded.
Specify the field for the external identifiers used in the upsert action.
Select the destination object processing mode. This parameter has the following options:
- Regular: Process the data in regular mode.
- BulkData: Process the data in bulk-load data mode.
- BulkBinary: Process the data in bulk-load binary mode.
Optionally, select a variable in which to store the bulk-load job identifier.
Specify how NULL values are handled. This parameter has the following options:
- True: NULL values are ignored and not sent for processing.
- False: NULL values are sent for processing.
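The effect of this parameter can be sketched in a few lines of Python (illustrative only; the field names are hypothetical):

```python
def prepare_record(record, ignore_null_value):
    """Sketch of IgnoreNullValue handling:
    True  -> NULL (None) fields are dropped and not sent for processing.
    False -> NULL fields are sent for processing (setting the field to NULL).
    """
    if ignore_null_value:
        return {field: value for field, value in record.items() if value is not None}
    return record

row = {"FirstName": "Ann", "Phone": None}
prepare_record(row, True)   # Phone is dropped from the record
prepare_record(row, False)  # Phone is sent as NULL
```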
Specify how to handle rows with errors.
Contains the unique identifier of the added, updated, or deleted record.
Specify the relationship name for updating the foreign key lookup with an external identifier.
Specify the referenced object for updating the foreign key lookup with an external identifier.
Specify the external identifier field for updating the foreign key lookup with an external identifier.
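Taken together, these three parameters correspond to Salesforce's external-identifier lookup syntax. As a hedged illustration (the external identifier field name below is hypothetical), a Contact record that sets its Account lookup through an external identifier is submitted in a shape like:

```
{
  "LastName": "Smith",
  "Account": { "ERP_Id__c": "A-1001" }
}
```

Here Account is the relationship name, Account is the referenced object, and ERP_Id__c is the external identifier field on that object.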
Optionally, specify additional bulk job information. For further details, review the Salesforce documentation.
The destination object is available in the data flow properties list. Follow these steps to set up an expression:
- Right-click on the data flow canvas and select the Properties menu.
- Scroll down and find the property named like [Salesforce Destination].[DestinationObject]. This is the property containing the destination table.
- Scroll down and find the Expressions property. Set up an expression to modify the statement dynamically.
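For example, an expression such as the following (the variable name User::TargetObject is hypothetical) on the [Salesforce Destination].[DestinationObject] property makes the component load whichever object the variable holds at run time:

```
@[User::TargetObject]
```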
The component's Error Output is used for providing both error information and the processed record identifier. Check the ErrorCode column:
- -1: Not an error record.
- >0: Error record. Check the ErrorDescription column for more details.
You can set up a standard Conditional Split component to filter non-error records.
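For example, a Conditional Split condition that passes only successfully processed rows to its output could be:

```
ErrorCode == -1
```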
Uploaded files are located in the Attachment object. Use the standard Import Column transformation to import file content into the Body field. See also the demo video above.
- Fixed: IgnoreNullValue parameter didn't work properly when bulk-mode was used.
- Fixed: Component failed with error "type must be specified for polymorphic foreign key field: Who" (Thank you, Mustafa).
- New: A new parameter BulkJobInfo for optional bulk job information.
- Fixed: Component assumed input date/time columns without time zone information are local time and converted them to UTC. This broke backwards compatibility for existing packages (Thank you, Bharat).
- Fixed: Component will now convert input date/time columns to UTC time.
- New: A new parameter IgnoreNullValue.
- New: A new parameter Concurrency with two options: Parallel and Serial.
- New: Component now permits use of identifier lookup fields for the Upsert action.
- Fixed: Component failed with error "The remote name could not be resolved" when using bulk-load mode with test service instances.
- New: A new parameter JobIdVariable to store bulk job identifier in variable.
- Fixed: Component failed with "Column data type is not supported by PipelineBuffer class." error when processing input columns of type DT_DBDATE (Thank you, Ellen).
- New: Component can now update foreign key lookups with external identifier.
- New: Component now supports setting object fields to NULL.
- Fixed: Component failed with "... is not valid for the type xsd:double" when used in international environment (Thank you, Blazej).
- New: Component now provides the identifier of newly created records (Thank you, Sam).
- New: Component now supports Upsert action - update and insert of records (Thank you, Brian).
- New: Component now supports update and delete of records.
- Fixed: Component failed under SQL 2008 when ErrorDescription column was not used.
- New: Introduced component.
Ready to give it a try?
COZYROC SSIS+ Components Suite is free for testing in your development environment.