Dynamics CRM 2011 Bulk Update
Posted: 2012-01-21 19:17:31

Question: We are running Dynamics CRM 2011 Rollup 3. We need to update millions of customer records periodically (delta updates). Using the standard update (one record at a time) would take weeks. We also don't want to touch the database directly, as that may break things in the future.
Is there a bulk update method in the Dynamics CRM 2011 web service / REST API that we can use? (What/Where/How)
[Question comments]:
A clear example of bulk create or update in MS CRM is given at the following link: mscrmtutorials.blogspot.in/2014/07/…
What did you end up doing? We used kingswaysoft.

[Answer 1]: I know this post is 2 years old, but I can add to it in case someone else reads it and has a similar need.
Peter Majeed's answer is on target in that CRM processes requests one record at a time. There is no bulk edit that works the way you are looking for. I would encourage you not to touch the database directly if you need or want Microsoft support.
If you are looking at periodic updates of millions of records, you have a few options. Consider using Scribe, or developing your own custom import utility or script using the CRM SDK.
Scribe will probably be your best option since it is cost-effective for data imports and lets you easily update and insert from the same file.
If you write your own .NET/SDK-based utility, I would suggest making it multi-threaded and programmatically breaking up your input file in memory or on disk, letting each thread work with its own subset of the data (assuming, of course, that the order of execution does not have to be chronological based on the content of the input file). If you can divide and conquer the input file across multiple threads, you can cut the overall execution time considerably. Also, if your corporate policy allows you access to one of the CRM servers and you can put your code directly on the server and execute it from there, you can eliminate the network latency between a workstation running the code and the CRM web services; a rough sketch of this approach follows.
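As a sketch only of the divide-and-conquer idea (this is not the answerer's actual utility), the partitioning might look like the following. The `createService` factory is an assumption standing in for however you build connections; in practice it would return a new `OrganizationServiceProxy` per thread, since a single proxy instance is not thread-safe.

using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Xrm.Sdk;

public static class ParallelUpdater
{
    // Splits the changed records into index ranges and updates each range on its own thread.
    // createService is a hypothetical factory (e.g. one new OrganizationServiceProxy per thread).
    public static void UpdateInParallel(
        IList<Entity> changedRecords,
        Func<IOrganizationService> createService,
        int degreeOfParallelism = 4)
    {
        if (changedRecords == null || changedRecords.Count == 0)
            return;

        var options = new ParallelOptions { MaxDegreeOfParallelism = degreeOfParallelism };

        Parallel.ForEach(
            Partitioner.Create(0, changedRecords.Count),
            options,
            range =>
            {
                IOrganizationService service = createService();
                for (int i = range.Item1; i < range.Item2; i++)
                {
                    // Still one record per request; the gain comes from doing several at once.
                    service.Update(changedRecords[i]);
                }
            });
    }
}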
Last but not least, if the large amounts of import data come from another system, you could write a CRM plug-in that runs on the Retrieve and RetrieveMultiple messages (events) in CRM for your specific entity, programmatically retrieves the desired data from the other system (falling back to a cached copy in CRM if the other system is unavailable), and keeps CRM up to date in real time or on a "last cached" basis. This certainly takes more coding effort, but it can potentially eliminate the need to run a large synchronization job every few weeks. A bare-bones sketch of such a plug-in follows.
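This is only a shape sketch (registered, for example, as a post-operation step on RetrieveMultiple of account); the `ExternalSystem` helper is a hypothetical stand-in for whatever external service or cached copy you would actually query.

using System;
using Microsoft.Xrm.Sdk;

// Sketch of a post-operation plug-in step on RetrieveMultiple of account.
// It patches the records being returned with data from another system (or its cache).
public class AccountEnrichmentPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

        if (context.MessageName != "RetrieveMultiple" ||
            !context.OutputParameters.Contains("BusinessEntityCollection"))
        {
            return;
        }

        var results = (EntityCollection)context.OutputParameters["BusinessEntityCollection"];
        foreach (Entity account in results.Entities)
        {
            // Hypothetical lookup against the other system, falling back to a cached copy.
            string freshPhone;
            if (ExternalSystem.TryGetPhone(account.Id, out freshPhone))
            {
                account["telephone1"] = freshPhone;
            }
        }
    }
}

// Stand-in for whatever external service or cached copy you would actually query.
internal static class ExternalSystem
{
    public static bool TryGetPhone(Guid accountId, out string phone)
    {
        phone = null;   // placeholder: no external call in this sketch
        return false;
    }
}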
[Comments]:

[Answer 2]: Yes and no, mostly no. Someone can correct me if I am mistaken, in which case I will gladly edit or delete my answer, but everything done in Dynamics CRM is done one record at a time. It does not even try to handle set-based inserts, updates, or deletes. So unless you go straight to direct database operations, it will take you weeks.
The web service does allow for "bulk" inserts, deletes, and updates, but I put "bulk" in quotes because all it does is set up an asynchronous process in which it performs all the relevant data operations, yes, one at a time. There is a section of the SDK that addresses this kind of data management (link). To update records this way you would first have to bear the overhead of selecting all the data you want to update, then creating an XML file containing that data, and finally updating the data (remember: one row at a time). So it would actually be more efficient to simply loop through your data and issue an Update request for each record.
(I will note that our organization has not had any memorable issues with direct database access to handle things the SDK cannot, nor have I seen anything in my own reading around the internet to suggest that others have.)
Edit:
See iFirefly's answer below for some other excellent ways to address this issue.
[Comments]:

[Answer 3]: I realize this is an old question, but it ranks highly for "CRM bulk update", so the Update Rollup 12 feature ExecuteMultiple deserves a mention here. It will not solve your problem (millions of records), because, as iFirefly and Peter point out, CRM still does everything one record at a time. What it does do is package all of your requests into a single envelope, letting CRM handle the execution of each update and reducing the number of round trips between your application and the server that you would otherwise pay by issuing an Update request for every single record.
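For reference, a minimal sketch of chunking updates through ExecuteMultipleRequest might look like the following; the 1000-requests-per-batch figure reflects the default server-side limit, and `service` is assumed to be an already-connected IOrganizationService.

using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

public static class BatchedUpdater
{
    // Sends updates in batches of up to 1000 requests per ExecuteMultipleRequest.
    public static void UpdateInBatches(IOrganizationService service, IList<Entity> entities, int batchSize = 1000)
    {
        for (int offset = 0; offset < entities.Count; offset += batchSize)
        {
            var batch = new ExecuteMultipleRequest
            {
                Settings = new ExecuteMultipleSettings
                {
                    ContinueOnError = true,
                    ReturnResponses = false
                },
                Requests = new OrganizationRequestCollection()
            };

            foreach (Entity entity in entities.Skip(offset).Take(batchSize))
            {
                batch.Requests.Add(new UpdateRequest { Target = entity });
            }

            var response = (ExecuteMultipleResponse)service.Execute(batch);
            if (response.IsFaulted)
            {
                // Set ReturnResponses = true above and inspect response.Responses
                // to see which records in this batch failed.
                Console.WriteLine("One or more updates in this batch faulted.");
            }
        }
    }
}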
[Comments]:

[Answer 4]: This is a rather old question, but nobody has mentioned the fastest (though also most challenging) way of updating or creating huge numbers of records in CRM 201X: using the built-in import feature, which is entirely doable with the CRM SDK. There is a perfect MSDN article about it: https://msdn.microsoft.com/en-us/library/gg328321(v=crm.5).aspx. In short, you have to:
1) Build an Excel file containing the data you want to import (simply export some data from CRM 201X and check what the structure looks like, keeping in mind that the first 3 columns are hidden)
2) Create an Import Map entity (specifying the file you created)
3) Create column mappings if necessary
4) Create the Import and ImportFile entities, supplying the proper mappings
5) Parse the data using ParseImportRequest
6) Transform the data using TransformImportRequest
7) Import the data using ImportRecordsImportRequest
Those were the steps for CRM 2011; now, in 2017, we have more versions available and there are slight differences between them. Check the sample that is available on MSDN and in the SDK: https://msdn.microsoft.com/en-us/library/hh547396(v=crm.5).aspx
Of course point 1 will be the most difficult part, because you have to build an XML or docx file that corresponds exactly to what CRM expects, but I assume you are doing this from an external application, so you can use some great .NET libraries that will make it much simpler.
I have never seen anything faster than the standard CRM import when it comes to updating or creating records, even when using parallelism and batched update requests.
In case anything goes wrong with the MSDN site, I am posting here the example from the link above, which shows how to import data into CRM programmatically:
using System;
using System.ServiceModel;
using System.Collections.Generic;
using System.Linq;

// These namespaces are found in the Microsoft.Xrm.Sdk.dll assembly
// located in the SDK\bin folder of the SDK download.
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Sdk.Messages;
using Microsoft.Xrm.Sdk.Metadata;

// These namespaces are found in the Microsoft.Crm.Sdk.Proxy.dll assembly
// located in the SDK\bin folder of the SDK download.
using Microsoft.Crm.Sdk.Messages;

namespace Microsoft.Crm.Sdk.Samples
{
    /// <summary>
    /// This sample shows how to define a complex mapping for importing and then use the
    /// Microsoft Dynamics CRM 2011 API to bulk import records with that mapping.
    /// </summary>
    public class ImportWithCreate
    {
        #region Class Level Members

        private OrganizationServiceProxy _serviceProxy;
        private DateTime _executionDate;

        #endregion

        /// <summary>
        /// This method first connects to the organization service. Afterwards,
        /// auditing is enabled on the organization, account entity, and a couple
        /// of attributes.
        /// </summary>
        /// <param name="serverConfig">Contains server connection information.</param>
        /// <param name="promptforDelete">When True, the user will be prompted to delete all
        /// created entities.</param>
        public void Run(ServerConnection.Configuration serverConfig, bool promptforDelete)
        {
            using (_serviceProxy = ServerConnection.GetOrganizationProxy(serverConfig))
            {
                // This statement is required to enable early bound type support.
                _serviceProxy.EnableProxyTypes();

                // Log the start time to ensure deletion of records created during execution.
                _executionDate = DateTime.Today;
                ImportRecords();
                DeleteRequiredRecords(promptforDelete);
            }
        }

        /// <summary>
        /// Imports records to Microsoft Dynamics CRM from the specified .csv file.
        /// </summary>
        public void ImportRecords()
        {
            // Create an import map.
            ImportMap importMap = new ImportMap()
            {
                Name = "Import Map " + DateTime.Now.Ticks.ToString(),
                Source = "Import Accounts.csv",
                Description = "Description of data being imported",
                EntitiesPerFile =
                    new OptionSetValue((int)ImportMapEntitiesPerFile.SingleEntityPerFile),
                EntityState = EntityState.Created
            };
            Guid importMapId = _serviceProxy.Create(importMap);

            // Create column mappings.

            #region Column One Mappings
            // Create a column mapping for a 'text' type field.
            ColumnMapping colMapping1 = new ColumnMapping()
            {
                // Set source properties.
                SourceAttributeName = "src_name",
                SourceEntityName = "Account_1",
                // Set target properties.
                TargetAttributeName = "name",
                TargetEntityName = Account.EntityLogicalName,
                // Relate this column mapping with the data map.
                ImportMapId =
                    new EntityReference(ImportMap.EntityLogicalName, importMapId),
                // Force this column to be processed.
                ProcessCode =
                    new OptionSetValue((int)ColumnMappingProcessCode.Process)
            };

            // Create the mapping.
            Guid colMappingId1 = _serviceProxy.Create(colMapping1);
            #endregion

            #region Column Two Mappings
            // Create a column mapping for a 'lookup' type field.
            ColumnMapping colMapping2 = new ColumnMapping()
            {
                // Set source properties.
                SourceAttributeName = "src_parent",
                SourceEntityName = "Account_1",
                // Set target properties.
                TargetAttributeName = "parentaccountid",
                TargetEntityName = Account.EntityLogicalName,
                // Relate this column mapping with the data map.
                ImportMapId =
                    new EntityReference(ImportMap.EntityLogicalName, importMapId),
                // Force this column to be processed.
                ProcessCode =
                    new OptionSetValue((int)ColumnMappingProcessCode.Process),
            };

            // Create the mapping.
            Guid colMappingId2 = _serviceProxy.Create(colMapping2);

            // Because we created a column mapping of type lookup, we need to specify lookup details in a lookupmapping.
            // One lookupmapping will be for the parent account, and the other for the current record.
            // This lookupmapping is important because without it the current record
            // cannot be used as the parent of another record.

            // Create a lookup mapping to the parent account.
            LookUpMapping parentLookupMapping = new LookUpMapping()
            {
                // Relate this mapping with its parent column mapping.
                ColumnMappingId =
                    new EntityReference(ColumnMapping.EntityLogicalName, colMappingId2),
                // Force this column to be processed.
                ProcessCode =
                    new OptionSetValue((int)LookUpMappingProcessCode.Process),
                // Set the lookup for an account entity by its name attribute.
                LookUpEntityName = Account.EntityLogicalName,
                LookUpAttributeName = "name",
                LookUpSourceCode =
                    new OptionSetValue((int)LookUpMappingLookUpSourceCode.System)
            };

            // Create the lookup mapping.
            Guid parentLookupMappingId = _serviceProxy.Create(parentLookupMapping);

            // Create a lookup on the current record's "src_name" so that this record can
            // be used as the parent account for another record being imported.
            // Without this lookup, no record using this account as its parent will be imported.
            LookUpMapping currentLookUpMapping = new LookUpMapping()
            {
                // Relate this lookup with its parent column mapping.
                ColumnMappingId =
                    new EntityReference(ColumnMapping.EntityLogicalName, colMappingId2),
                // Force this column to be processed.
                ProcessCode =
                    new OptionSetValue((int)LookUpMappingProcessCode.Process),
                // Set the lookup for the current record by its src_name attribute.
                LookUpAttributeName = "src_name",
                LookUpEntityName = "Account_1",
                LookUpSourceCode =
                    new OptionSetValue((int)LookUpMappingLookUpSourceCode.Source)
            };

            // Create the lookup mapping
            Guid currentLookupMappingId = _serviceProxy.Create(currentLookUpMapping);
            #endregion

            #region Column Three Mappings
            // Create a column mapping for a 'picklist' type field
            ColumnMapping colMapping3 = new ColumnMapping()
            {
                // Set source properties
                SourceAttributeName = "src_addresstype",
                SourceEntityName = "Account_1",
                // Set target properties
                TargetAttributeName = "address1_addresstypecode",
                TargetEntityName = Account.EntityLogicalName,
                // Relate this column mapping with its parent data map
                ImportMapId =
                    new EntityReference(ImportMap.EntityLogicalName, importMapId),
                // Force this column to be processed
                ProcessCode =
                    new OptionSetValue((int)ColumnMappingProcessCode.Process)
            };

            // Create the mapping
            Guid colMappingId3 = _serviceProxy.Create(colMapping3);

            // Because we created a column mapping of type picklist, we need to specify picklist details in a picklistMapping
            PickListMapping pickListMapping1 = new PickListMapping()
            {
                SourceValue = "bill",
                TargetValue = 1,
                // Relate this column mapping with its column mapping data map
                ColumnMappingId =
                    new EntityReference(ColumnMapping.EntityLogicalName, colMappingId3),
                // Force this column to be processed
                ProcessCode =
                    new OptionSetValue((int)PickListMappingProcessCode.Process)
            };

            // Create the mapping
            Guid picklistMappingId1 = _serviceProxy.Create(pickListMapping1);

            // Need a picklist mapping for every address type code expected
            PickListMapping pickListMapping2 = new PickListMapping()
            {
                SourceValue = "ship",
                TargetValue = 2,
                // Relate this column mapping with its column mapping data map
                ColumnMappingId =
                    new EntityReference(ColumnMapping.EntityLogicalName, colMappingId3),
                // Force this column to be processed
                ProcessCode =
                    new OptionSetValue((int)PickListMappingProcessCode.Process)
            };

            // Create the mapping
            Guid picklistMappingId2 = _serviceProxy.Create(pickListMapping2);
            #endregion

            // Create Import
            Import import = new Import()
            {
                // IsImport is obsolete; use ModeCode to declare Create or Update.
                ModeCode = new OptionSetValue((int)ImportModeCode.Create),
                Name = "Importing data"
            };
            Guid importId = _serviceProxy.Create(import);

            // Create Import File.
            ImportFile importFile = new ImportFile()
            {
                Content = BulkImportHelper.ReadCsvFile("Import Accounts.csv"), // Read contents from disk.
                Name = "Account record import",
                IsFirstRowHeader = true,
                ImportMapId = new EntityReference(ImportMap.EntityLogicalName, importMapId),
                UseSystemMap = false,
                Source = "Import Accounts.csv",
                SourceEntityName = "Account_1",
                TargetEntityName = Account.EntityLogicalName,
                ImportId = new EntityReference(Import.EntityLogicalName, importId),
                EnableDuplicateDetection = false,
                FieldDelimiterCode =
                    new OptionSetValue((int)ImportFileFieldDelimiterCode.Comma),
                DataDelimiterCode =
                    new OptionSetValue((int)ImportFileDataDelimiterCode.DoubleQuote),
                ProcessCode =
                    new OptionSetValue((int)ImportFileProcessCode.Process)
            };

            // Get the current user to set as record owner.
            WhoAmIRequest systemUserRequest = new WhoAmIRequest();
            WhoAmIResponse systemUserResponse =
                (WhoAmIResponse)_serviceProxy.Execute(systemUserRequest);

            // Set the owner ID.
            importFile.RecordsOwnerId =
                new EntityReference(SystemUser.EntityLogicalName, systemUserResponse.UserId);

            Guid importFileId = _serviceProxy.Create(importFile);

            // Retrieve the header columns used in the import file.
            GetHeaderColumnsImportFileRequest headerColumnsRequest = new GetHeaderColumnsImportFileRequest()
            {
                ImportFileId = importFileId
            };
            GetHeaderColumnsImportFileResponse headerColumnsResponse =
                (GetHeaderColumnsImportFileResponse)_serviceProxy.Execute(headerColumnsRequest);

            // Output the header columns.
            int columnNum = 1;
            foreach (string headerName in headerColumnsResponse.Columns)
            {
                Console.WriteLine("Column[" + columnNum.ToString() + "] = " + headerName);
                columnNum++;
            }

            // Parse the import file.
            ParseImportRequest parseImportRequest = new ParseImportRequest()
            {
                ImportId = importId
            };
            ParseImportResponse parseImportResponse =
                (ParseImportResponse)_serviceProxy.Execute(parseImportRequest);
            Console.WriteLine("Waiting for Parse async job to complete");
            BulkImportHelper.WaitForAsyncJobCompletion(_serviceProxy, parseImportResponse.AsyncOperationId);
            BulkImportHelper.ReportErrors(_serviceProxy, importFileId);

            // Retrieve the first two distinct values for column 1 from the parse table.
            // NOTE: You must create the parse table first using the ParseImport message.
            // The parse table is not accessible after ImportRecordsImportResponse is called.
            GetDistinctValuesImportFileRequest distinctValuesRequest = new GetDistinctValuesImportFileRequest()
            {
                columnNumber = 1,
                ImportFileId = importFileId,
                pageNumber = 1,
                recordsPerPage = 2,
            };
            GetDistinctValuesImportFileResponse distinctValuesResponse =
                (GetDistinctValuesImportFileResponse)_serviceProxy.Execute(distinctValuesRequest);

            // Output the distinct values. In this case: (column 1, row 1) and (column 1, row 2).
            int cellNum = 1;
            foreach (string cellValue in distinctValuesResponse.Values)
            {
                Console.WriteLine("(1, " + cellNum.ToString() + "): " + cellValue);
                Console.WriteLine(cellValue);
                cellNum++;
            }

            // Retrieve data from the parse table.
            // NOTE: You must create the parse table first using the ParseImport message.
            // The parse table is not accessible after ImportRecordsImportResponse is called.
            RetrieveParsedDataImportFileRequest parsedDataRequest = new RetrieveParsedDataImportFileRequest()
            {
                ImportFileId = importFileId,
                PagingInfo = new PagingInfo()
                {
                    // Specify the number of entity instances returned per page.
                    Count = 2,
                    // Specify the number of pages returned from the query.
                    PageNumber = 1,
                    // Specify a total number of entity instances returned.
                    PagingCookie = "1"
                }
            };
            RetrieveParsedDataImportFileResponse parsedDataResponse =
                (RetrieveParsedDataImportFileResponse)_serviceProxy.Execute(parsedDataRequest);

            // Output the first two rows retrieved.
            int rowCount = 1;
            foreach (string[] rows in parsedDataResponse.Values)
            {
                int colCount = 1;
                foreach (string column in rows)
                {
                    Console.WriteLine("(" + rowCount.ToString() + "," + colCount.ToString() + ") = " + column);
                    colCount++;
                }
                rowCount++;
            }

            // Transform the import
            TransformImportRequest transformImportRequest = new TransformImportRequest()
            {
                ImportId = importId
            };
            TransformImportResponse transformImportResponse =
                (TransformImportResponse)_serviceProxy.Execute(transformImportRequest);
            Console.WriteLine("Waiting for Transform async job to complete");
            BulkImportHelper.WaitForAsyncJobCompletion(_serviceProxy, transformImportResponse.AsyncOperationId);
            BulkImportHelper.ReportErrors(_serviceProxy, importFileId);

            // Upload the records.
            ImportRecordsImportRequest importRequest = new ImportRecordsImportRequest()
            {
                ImportId = importId
            };
            ImportRecordsImportResponse importResponse =
                (ImportRecordsImportResponse)_serviceProxy.Execute(importRequest);
            Console.WriteLine("Waiting for ImportRecords async job to complete");
            BulkImportHelper.WaitForAsyncJobCompletion(_serviceProxy, importResponse.AsyncOperationId);
            BulkImportHelper.ReportErrors(_serviceProxy, importFileId);
        }

        /// <summary>
        /// Deletes any entity records that were created for this sample.
        /// <param name="prompt">Indicates whether to prompt the user
        /// to delete the records created in this sample.</param>
        /// </summary>
        public void DeleteRequiredRecords(bool prompt)
        {
            bool toBeDeleted = true;

            if (prompt)
            {
                // Ask the user if the created entities should be deleted.
                Console.Write("\nDo you want these entity records deleted? (y/n) [y]: ");
                String answer = Console.ReadLine();
                if (answer.StartsWith("y") ||
                    answer.StartsWith("Y") ||
                    answer == String.Empty)
                {
                    toBeDeleted = true;
                }
                else
                {
                    toBeDeleted = false;
                }
            }

            if (toBeDeleted)
            {
                // Retrieve all account records created in this sample.
                QueryExpression query = new QueryExpression()
                {
                    EntityName = Account.EntityLogicalName,
                    Criteria = new FilterExpression()
                    {
                        Conditions =
                        {
                            new ConditionExpression("createdon", ConditionOperator.OnOrAfter, _executionDate),
                        }
                    },
                    ColumnSet = new ColumnSet(false)
                };
                var accountsCreated = _serviceProxy.RetrieveMultiple(query).Entities;

                // Delete all records created in this sample.
                foreach (var account in accountsCreated)
                {
                    _serviceProxy.Delete(Account.EntityLogicalName, account.Id);
                }

                Console.WriteLine("Entity record(s) have been deleted.");
            }
        }

        #region Main method

        /// <summary>
        /// Standard Main() method used by most SDK samples.
        /// </summary>
        /// <param name="args"></param>
        static public void Main(string[] args)
        {
            try
            {
                // Obtain the target organization's web address and client logon
                // credentials from the user.
                ServerConnection serverConnect = new ServerConnection();
                ServerConnection.Configuration config = serverConnect.GetServerConfiguration();

                var app = new ImportWithCreate();
                app.Run(config, true);
            }
            catch (FaultException<Microsoft.Xrm.Sdk.OrganizationServiceFault> ex)
            {
                Console.WriteLine("The application terminated with an error.");
                Console.WriteLine("Timestamp: {0}", ex.Detail.Timestamp);
                Console.WriteLine("Code: {0}", ex.Detail.ErrorCode);
                Console.WriteLine("Message: {0}", ex.Detail.Message);
                Console.WriteLine("Trace: {0}", ex.Detail.TraceText);
                Console.WriteLine("Inner Fault: {0}",
                    null == ex.Detail.InnerFault ? "No Inner Fault" : "Has Inner Fault");
            }
            catch (System.TimeoutException ex)
            {
                Console.WriteLine("The application terminated with an error.");
                Console.WriteLine("Message: {0}", ex.Message);
                Console.WriteLine("Stack Trace: {0}", ex.StackTrace);
                Console.WriteLine("Inner Fault: {0}",
                    null == ex.InnerException.Message ? "No Inner Fault" : ex.InnerException.Message);
            }
            catch (System.Exception ex)
            {
                Console.WriteLine("The application terminated with an error.");
                Console.WriteLine(ex.Message);

                // Display the details of the inner exception.
                if (ex.InnerException != null)
                {
                    Console.WriteLine(ex.InnerException.Message);

                    FaultException<Microsoft.Xrm.Sdk.OrganizationServiceFault> fe = ex.InnerException
                        as FaultException<Microsoft.Xrm.Sdk.OrganizationServiceFault>;
                    if (fe != null)
                    {
                        Console.WriteLine("Timestamp: {0}", fe.Detail.Timestamp);
                        Console.WriteLine("Code: {0}", fe.Detail.ErrorCode);
                        Console.WriteLine("Message: {0}", fe.Detail.Message);
                        Console.WriteLine("Trace: {0}", fe.Detail.TraceText);
                        Console.WriteLine("Inner Fault: {0}",
                            null == fe.Detail.InnerFault ? "No Inner Fault" : "Has Inner Fault");
                    }
                }
            }
            // Additional exceptions to catch: SecurityTokenValidationException, ExpiredSecurityTokenException,
            // SecurityAccessDeniedException, MessageSecurityException, and SecurityNegotiationException.
            finally
            {
                Console.WriteLine("Press <Enter> to exit.");
                Console.ReadLine();
            }
        }
        #endregion Main method
    }
}
[Comments]:

[Answer 5]: Not sure how this would go with millions of records, but you can select your records and then click the Edit button in the ribbon. This brings up the "Edit Multiple Records" dialog, and any changes you make are applied to all of your records.
[Comments]:
The updates are all individual and occur on a regular basis. Delta updates of customer records means changes to people's addresses, phone numbers, and so on.

[Answer 6]: The BulkUpdate API works great for me; it is 10 times faster than updating records one at a time. Here is a snippet that performs a bulk update:
public override ExecuteMultipleResponse BulkUpdate(List<Entity> entities)
{
    ExecuteMultipleRequest request = new ExecuteMultipleRequest()
    {
        Settings = new ExecuteMultipleSettings()
        {
            ContinueOnError = true,
            ReturnResponses = true
        },
        Requests = new OrganizationRequestCollection()
    };

    for (int i = 0; i < entities.Count; i++)
    {
        request.Requests.Add(new UpdateRequest() { Target = entities[i] });
    }

    return (ExecuteMultipleResponse)ServiceContext.Execute(request);
}
[Comments]:

[Answer 7]: I worked on a very large data-migration project for Dynamics CRM 2011. We needed to load about 3 million records over a weekend. I ended up building a console application (single-threaded) and running multiple instances on several machines. Each console application had an ID (1, 2, and so on) and was responsible for loading a segment of the data based on a unique SQL WHERE clause that matched the application's ID.
You could do the same thing for updates. Each instance could query a subset of the records to update and perform the updates through the SDK. Since we loaded millions of records over a weekend, I think you could perform millions of updates (if they are relatively small) in just a few hours. A sketch of the partitioning idea follows.
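As a sketch of the partitioning idea only (the staging table and column names here are hypothetical, not from the original project), each instance might claim its segment with a modulo filter like this:

using System;

// Sketch of how each console-app instance might claim its own data segment.
// The point is the modulo-based WHERE clause keyed off the instance id.
class SegmentedLoader
{
    static void Main(string[] args)
    {
        int appId = int.Parse(args[0]);           // this instance's id: 1, 2, 3, ...
        int totalInstances = int.Parse(args[1]);  // total number of instances running

        string segmentQuery =
            "SELECT RecordId, AccountId, Address1_Line1, Telephone1 " +
            "FROM dbo.StagingCustomers " +                      // hypothetical staging table
            "WHERE RecordId % " + totalInstances + " = " + (appId - 1);

        Console.WriteLine("This instance processes rows matching: " + segmentQuery);
        // Each instance then loads its rows, maps them to Entity objects,
        // and pushes the updates through the CRM SDK for just that segment.
    }
}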
[Comments]:

[Answer 8]: The Microsoft PFE team for Dynamics CRM wrote Another CRM SDK library that uses parallelization to bulk-execute requests while ensuring thread safety.
You could try it: Parallel Execute Requests. I would be curious to know whether it works and scales to millions of records.
[Comments]:

[Answer 9]: CRM does not implement a way to update data in bulk; there are 3 ways to improve bulk-update performance, but internally none of them can change the fact that CRM updates records one by one. Basically the ideas are:
- Reduce the time wasted on communicating with the CRM server
- Use parallelism to perform multiple operations at the same time
- Make sure the update process does not trigger any workflows or plug-ins; otherwise you may never see the end of the process

The 3 ways to improve bulk-operation performance:
1. After Update Rollup 12 there is the ExecuteMultipleRequest feature, which allows you to send up to 1000 requests at once. That saves you the round trips of sending 1000 separate requests to the CRM web service; however, the requests are still processed one after another. So if your CRM server is well configured, this approach most likely will not help very much.
2. You can use an OrganizationServiceContext instance for the bulk update. OrganizationServiceContext implements the unit-of-work pattern, so you can make multiple updates and transmit those operations to the server in a single call (see the sketch after this list). Compared with ExecuteMultipleRequest there is no limit on the number of requests, but if a failure occurs during the update, it rolls back all of the changes.
3. Use multi-threading or multi-tasking. Either way improves speed, but both are likely to produce some connection failures or SQL errors, so you will need to add retry logic to your code.
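A minimal sketch of option 2, assuming you already have an IOrganizationService connection and a list of entities that carry only their IDs and the changed attributes:

using System.Collections.Generic;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;

public static class ContextUpdater
{
    // Queues many updates on an OrganizationServiceContext and sends them with one SaveChanges() call.
    public static void UpdateWithContext(IOrganizationService service, IEnumerable<Entity> changedRecords)
    {
        var context = new OrganizationServiceContext(service);

        foreach (Entity record in changedRecords)
        {
            context.Attach(record);        // start tracking the detached entity
            context.UpdateObject(record);  // mark it as modified
        }

        // All queued updates are transmitted to the server here.
        context.SaveChanges();
    }
}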
[Comments]:

[Answer 10]: One of my clients had exactly the same problem. He solved it by building a custom ETL and attacking the two front ends in parallel. The whole thing was written in C#. These days, KingswaySoft or Scribe could do it.
[Comments]: