by nagstler on 3/2/24, 8:38 AM with 13 comments
by tootie on 3/2/24, 4:44 PM
by gregw2 on 3/2/24, 11:06 PM
When you push data into Salesforce and the Salesforce API returns errors indicating some row in the target can't be modified because it is busy, can your tool detect which rows failed to load and retry them N times (ideally with exponential backoff, or at least a fixed delay between retries)? And if those retries still fail, can it report back which rows/IDs never made it into Salesforce as permanent failures? (I haven't run across a tool that does this, but it was my pain point.)
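A minimal sketch of that retry behavior, assuming a hypothetical per-row `push_row` callable and a `RowBusyError` raised when Salesforce reports the record as locked; the names are placeholders, not any particular tool's API:

    import time

    class RowBusyError(Exception):
        """Raised (hypothetically) when Salesforce reports the target row is busy/locked."""
        pass

    def push_rows_with_retry(rows, push_row, max_retries=3, base_delay=1.0):
        """Push each row, retrying busy rows with exponential backoff.

        Returns (succeeded_ids, failed_ids) so permanent failures can be
        reported back to the caller.
        """
        succeeded, failed = [], []
        for row in rows:
            for attempt in range(max_retries + 1):
                try:
                    push_row(row)
                    succeeded.append(row["Id"])
                    break
                except RowBusyError:
                    if attempt == max_retries:
                        failed.append(row["Id"])  # permanent failure after N retries
                    else:
                        time.sleep(base_delay * (2 ** attempt))  # exponential backoff
        return succeeded, failed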
by t0mas88 on 3/2/24, 10:19 PM
Have you considered adding source connectors for S3-based data lakes? For example, Parquet files or Delta Lake? Maybe via AWS Athena, to make it similar to the Redshift connector?
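For illustration, a rough sketch of what such a connector might do under the hood with boto3, assuming the Parquet/Delta tables are already registered in Athena; the database name and S3 output location are placeholders:

    import time
    import boto3

    def query_parquet_via_athena(sql, database="lake",
                                 output="s3://my-athena-results/"):
        """Run a query against Parquet/Delta tables registered in Athena
        and return the raw result rows."""
        athena = boto3.client("athena")
        qid = athena.start_query_execution(
            QueryString=sql,
            QueryExecutionContext={"Database": database},
            ResultConfiguration={"OutputLocation": output},
        )["QueryExecutionId"]

        # Poll until the query reaches a terminal state.
        while True:
            state = athena.get_query_execution(QueryExecutionId=qid)
            status = state["QueryExecution"]["Status"]["State"]
            if status in ("SUCCEEDED", "FAILED", "CANCELLED"):
                break
            time.sleep(1)

        if status != "SUCCEEDED":
            raise RuntimeError(f"Athena query ended in state {status}")
        return athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]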
by zicon35 on 3/3/24, 1:38 PM
by volderette on 3/2/24, 12:49 PM