The Salesforce Destination Component is an SSIS Data Flow Component for loading data into a Salesforce object.
In this section we will show you how to set up a Salesforce Destination component.
- Before you begin, configure a Salesforce connection manager.
- Configure a source component that will provide the data to be written to Salesforce.
- Ensure that you are on the Data Flow canvas.
- In the SSIS Toolbox, locate the Salesforce Destination component and drag it onto the Data Flow canvas.
- Connect the blue output arrow from the source component (or from the data flow transformation that you want to immediately precede the Salesforce Destination) to the Salesforce Destination component.
- Double-click on the component on the canvas.
- Once the component editor opens, select the connection manager you configured earlier from the Connection drop-down list.
- Choose the desired Action (Create, Update, Delete, or Upsert).
- Choose the Object you will be working with.
- The options will vary depending on the Action you selected. Check the options you desire. Descriptions of the options can be found in the Parameters section of the Salesforce Destination documentation.
- For Mode, select either Regular or Bulk.
- Click OK to close the component editor.
Use the parameters below to configure the component.
Select an existing Salesforce connection manager.
Related Topics: Salesforce Connection Manager
Select a destination object action. This parameter has the options listed in the following table.
| Action | Description |
| --- | --- |
| Create | Create a new record in the destination object. |
| Update | Update an existing record in the destination object. You must specify the appropriate object record identifier for the update to work. |
| Delete | Delete an existing record from the destination object. You must specify the appropriate object record identifier for the delete to work. |
| Upsert (1.4 SR-3) | Update or insert a record in the destination object. Selecting this value displays the dynamic parameter ExternalId. |

After changing the action, you must use the Refresh command in the 'Column Mappings' tab to reload the destination object metadata.
Specify the number of rows to be sent as a batch. The maximum for regular mode is 200. The maximum for bulk-load mode is 10,000.
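The batching rule above can be sketched in a few lines. This is an illustration only, not the component's code; the `batch_rows` helper is hypothetical, but the 200 and 10,000 limits mirror the maximums stated above for regular and bulk-load mode.

```python
# Hypothetical sketch: group rows into batches, clamping the requested
# batch size to the mode's maximum (200 regular, 10,000 bulk-load).
MAX_BATCH = {"regular": 200, "bulk": 10_000}

def batch_rows(rows, mode="regular", batch_size=None):
    """Split rows into batches no larger than the mode's maximum."""
    limit = MAX_BATCH[mode]
    size = min(batch_size or limit, limit)  # clamp to the mode's maximum
    return [rows[i:i + size] for i in range(0, len(rows), size)]

batches = batch_rows(list(range(450)), mode="regular")
print([len(b) for b in batches])  # [200, 200, 50]
```

Note that a requested batch size above the mode's maximum is clamped rather than rejected in this sketch; the component itself simply does not accept values above the maximum.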
Select bulk-load concurrency mode. This parameter has the options listed in the following table.
| Value | Description |
| --- | --- |
| Parallel | Process batches in parallel mode (default). |
| Serial | Process batches in serial mode. |

Processing in parallel can cause database contention. When contention is severe, the job may fail. If you're experiencing this issue, submit the job with serial concurrency mode, which guarantees that batches are processed one at a time. Note that using this option may significantly increase the processing time for a job.
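The difference between the two concurrency modes can be sketched as follows. This is a simplified stand-in, not the component's implementation: `process_batch` is a hypothetical placeholder for submitting one batch to Salesforce.

```python
# Sketch of Parallel vs Serial batch processing.
from concurrent.futures import ThreadPoolExecutor

def process_batch(batch):
    return sum(batch)  # placeholder for submitting one batch

def run_job(batches, concurrency="Parallel"):
    if concurrency == "Serial":
        # Batches are processed strictly one at a time: no contention,
        # but total time grows with the number of batches.
        return [process_batch(b) for b in batches]
    # Parallel (default): batches may be in flight simultaneously,
    # which is faster but can contend for the same database rows.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(process_batch, batches))
```

Both modes produce the same results; they differ only in whether batches can overlap in time.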
Specify the destination Salesforce object where the data is to be loaded.
Specify the field for the external identifiers used in the upsert action.
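Upsert semantics can be illustrated with a small in-memory sketch: match on the external identifier field, update the record if a match exists, otherwise create it. The field name `ExtId__c` and the data are made up for illustration.

```python
# Hypothetical illustration of upsert keyed on an external identifier field.
def upsert(table, record, external_id_field):
    key = record[external_id_field]
    for row in table:
        if row.get(external_id_field) == key:
            row.update(record)          # existing record: update in place
            return "updated"
    table.append(dict(record))          # no match: insert a new record
    return "created"

contacts = [{"ExtId__c": "A1", "Email": "old@example.com"}]
print(upsert(contacts, {"ExtId__c": "A1", "Email": "new@example.com"}, "ExtId__c"))  # updated
print(upsert(contacts, {"ExtId__c": "B2", "Email": "b@example.com"}, "ExtId__c"))    # created
```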
Select the destination object processing mode. This parameter has the options listed in the following table.
| Value | Description |
| --- | --- |
| Regular | Process the data in regular mode. |
| BulkData | Process the data in bulk-load data mode. |
| BulkBinary | Process the data in bulk-load binary mode. |
Select a variable to store the bulk-load job identifier. Optional.
Specify how NULL values are handled. This parameter has the options listed in the following table.

| Value | Description |
| --- | --- |
| True | NULL values are ignored and not sent for processing. |
| False | NULL values are sent for processing. |
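The effect of this parameter on an outgoing row can be sketched as follows (an assumption-laden illustration; `prepare_record` and the sample data are hypothetical):

```python
# Sketch of IgnoreNullValue: when True, NULL (None) fields are dropped
# from the outgoing record so they are not sent for processing; when
# False, they are sent and can clear the field on the destination.
def prepare_record(record, ignore_null_value=True):
    if ignore_null_value:
        return {k: v for k, v in record.items() if v is not None}
    return dict(record)

row = {"Name": "Acme", "Phone": None}
print(prepare_record(row, True))   # {'Name': 'Acme'}
print(prepare_record(row, False))  # {'Name': 'Acme', 'Phone': None}
```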
Specify how to handle rows with errors.
Contains the unique identifier of the added, updated, or deleted record.
Specify the relationship name for updating the foreign key lookup with an external identifier.
Specify the referenced object for updating the foreign key lookup with an external identifier.
Specify the external identifier field for updating the foreign key lookup with an external identifier.
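The three parameters above work together: the external identifier field and its value are nested under the relationship name, which is how a foreign key lookup is expressed without knowing the referenced record's internal Id. A hedged sketch of the resulting payload shape, where `Account`, `AccountNumber__c`, and the values are illustrative only:

```python
# Sketch: express a foreign key lookup via an external identifier by
# nesting the external-id field/value pair under the relationship name.
def foreign_key_reference(relationship_name, external_id_field, value):
    return {relationship_name: {external_id_field: value}}

contact = {"LastName": "Doe"}
contact.update(foreign_key_reference("Account", "AccountNumber__c", "ACCT-0042"))
print(contact)
# {'LastName': 'Doe', 'Account': {'AccountNumber__c': 'ACCT-0042'}}
```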
Specify optional bulk job information. For further details, review the Salesforce documentation. Optional.
- Where can I find the documentation for the Salesforce Destination?
- How to add timezone to the datetime column?
- Missing objects or columns when using Salesforce Destination Component
- How to upload files to Salesforce
- Error Message: unable to connect to remote server
- Fixed: Id field was not included in the list of available mapping columns when using Upsert action.
- Fixed: IgnoreNullValue parameter didn't work properly when bulk-mode was used.
- Fixed: Component failed with error "type must be specified for polymorphic foreign key field: Who" (Thank you, Mustafa).
- New: A new parameter BulkJobInfo for optional bulk job information.
- Fixed: Component assumed input date/time columns without time zone information are local time and converted them to UTC. This broke backwards compatibility for existing packages (Thank you, Bharat).
- Fixed: Component will now convert input date/time columns to UTC time.
- New: A new parameter IgnoreNullValue.
- New: A new parameter Concurrency with two options: Parallel and Serial.
- New: Component now permits use of identifier lookup fields for the Upsert action.
- Fixed: Component failed with error "The remote name could not be resolved" when using bulk-load mode with test service instances.
- New: A new parameter JobIdVariable to store bulk job identifier in variable.
- Fixed: Component failed with "Column data type is not supported by PipelineBuffer class." error when processing input columns of type DT_DBDATE (Thank you, Ellen).
- New: Component can now update foreign key lookups with external identifier.
- New: Component now supports setting object fields to NULL.
- Fixed: Component failed with "... is not valid for the type xsd:double" when used in international environment (Thank you, Blazej).
- New: Component now provides newly created records identifier (Thank you, Sam).
- New: Component now supports Upsert action - update and insert of records (Thank you, Brian).
- New: Component now supports update and delete of records.
- Fixed: Component failed under SQL 2008 when ErrorDescription column was not used.
- New: Introduced component.
Ready to give it a try?
COZYROC SSIS+ Components Suite is free for testing in your development environment.