cub-e.net

just coding...

Use Try and Finally on Disposable Resources

Why

Using a try/finally block ensures that resources are disposed even if an exception occurs. Not disposing resources properly leads to performance degradation over time. 

When

This is an important guideline when working with disposable resources such as:

  • Database connections 
  • File handles 
  • Message queue handles 
  • Text reader, writer 
  • Binary reader, writer 
  • Crypto stream 
  • Symmetric, Asymmetric and Hash algorithms. 
  • Windows Identity, Windows Impersonation Context  
  • Timer, Wait Handle in case of threading 
  • Impersonation Context 
  • XML Reader 
  • XML Writer

How

The following code fragment demonstrates disposing resources in a finally block.

SqlConnection conn = new SqlConnection(...);

try

{

   conn.Open();

   ...

}

finally

{

   if(null != conn)

     conn.Close();

}

If you are developing in C#, you can use the 'using' keyword, which automatically disposes resources after use.

using(SqlConnection conn = new SqlConnection(...))

{

   conn.Open();   

   ....

}


Problem Example

A database connection is opened and used to access data. 

Unfortunately, if there is an exception other than SqlException, or if the exception handling code itself throws an exception, the database connection won't be closed. This failure to close database connections could cause the application to run out of connections, impacting application performance.

try

{

   conn.Open();

   // do something with the connection

   // some more processing

   // close the connection

   if(conn != null)

     conn.Close();

}

catch(SqlException ex)

{

   // do exception handling

   // close the connection

   if(conn != null)

     conn.Close();

}

Solution Example

A database connection is opened and used to access data. The finally block will be executed whether or not an exception occurs, ensuring that the database connection is closed. 

SqlConnection conn = new SqlConnection(...);

try

{

   conn.Open();

   // do something with the connection

   // some more processing

}

catch(SqlException ex)

{

   // do exception handling 

  

}

finally

{

   // close the connection

   if(conn != null)

     conn.Close();

}

 

 

Additional Resources

Related Items

Explicitly Call Dispose or Close on Resources You Open

If you use objects that implement the IDisposable interface, make sure you call the Dispose method of the object or the Close method if one is provided. Database connection and files are examples of shared resources that should be explicitly closed. 


Why

Failing to call Close or Dispose prolongs the life of the object in memory long after the client stops using it. This defers cleanup and results in inefficient memory usage. 

When

This guideline should be used when working with disposable resources such as:

  • Database connections 
  • File handles 
  • Message queue handles 
  • Text reader, writer 
  • Binary reader, writer 
  • Crypto stream 
  • Symmetric, Asymmetric and Hash algorithms. 
  • Windows Identity, Windows Impersonation Context  
  • Timer, Wait Handle in case of threading 
  • Impersonation Context 
  • XML Reader 
  • XML Writer 

How

When working with disposable resources, call the Dispose method of the object, or the Close method if one is provided. The Close method internally calls Dispose. The finally clause of a try/finally block is a good place to ensure that the Close or Dispose method of the object is called.

The following Visual Basic® .NET code fragment demonstrates disposing resources in finally block.

Try

  conn.Open()

  ...

Finally

  If Not(conn Is Nothing) Then

    conn.Close()

  End If

End Try

In Visual C#®, you can wrap resources that should be disposed in a using block. When the using block completes, Dispose is called on the object declared in the parentheses of the using statement. The following code fragment shows how you can wrap resources that should be disposed by using a using block.

using (SqlConnection conn = new SqlConnection(connString))

{

  conn.Open();

  . . .

} // Dispose is automatically called on the connection object conn here.


Problem Example

A .NET application opens a database connection, but the connection is never disposed. Since it is not explicitly disposed, the connection stays active until the application terminates, unnecessarily increasing the application's memory usage. Database connections are a limited resource; failing to dispose of them could exhaust all available connections.

SqlConnection conn = new SqlConnection(...);

try

{

   conn.Open();

   // do some processing with the connection

}

catch(SqlException ex)

{

   // do exception handling

}


Solution Example

A .NET application opens a database connection. Calling the Close method on the connection object in the finally clause ensures that the database connection is disposed after it has been used.

SqlConnection conn = new SqlConnection(...);

try

{

   conn.Open();

   // do some processing with the connection   

}

catch(SqlException ex)

{

   // do exception handling    

}

finally

{

   // close the connection

   if(conn != null)

     conn.Close();

}

 


Additional Resources

Related Items

Use Try and Finally on Disposable Resources

CRM Saturday Oslo 2017

Please join us at the next CRM Saturday event in Oslo (Norway) on 26th August. My session is "D365 New Features & Deprecations".

Full details and registration can be found on the CRM Saturday website: http://mwns.co/cso2017

CRM Saturday is a free CRM technical and strategy event organised by Microsoft Dynamics community MVPs for Dynamics 365 professionals, technical consultants, and developers. Learn and share new skills while promoting the CRM Manifesto, helping organisations overcome the challenges of implementing a successful CRM strategy with Microsoft Dynamics.

This is a whole-day event with many speakers from around the world. It is an advertising- and recruitment-free event, and we are setting up both a technical and a business/strategy track, so there will be interesting sessions for everyone.


Managing Complex CRM Scenarios by Using the SolutionPackager Tool


In enterprise scenarios, it is a common requirement to be able to deliver common or separate CRM modules to separate business units according to an enterprise-level or separate roll-out timeline.

The typical challenges of this scenario result from parallel development tracks and frequent changes to enterprise projects, which require the greatest degree of flexibility in every aspect:

§ Sharing common code/customization base

§ Managing solution layering and dependencies

§ Managing localization

§ Managing changes in layering (moving items to common or back, customizing common elements locally)

§ Versioning common and local solutions

§ Parallel development on multiple solutions, multiple solution-layer setups, and multiple versions of a solution

§ Central repository of every/any solution

§ Test and deploy all possible or required combination of solution layers

§ Managing solution branch and merge

§ Regression testing changes and impacts across solutions and branches

§ Common, single versioning of all solution elements

Development and Version Control

You need streamlined and common techniques of development and repository management for complex enterprise deliveries. The standard features of Microsoft Team Foundation Server serve as the basis of larger application delivery management scenarios:


§ ALM

§ Debugging and diagnostics

§ Testing tools

§ Architecture and modeling

§ Database development

§ Integrated development environment

§ Build management and automation

§ Version control

§ Test case management

§ Work item tracking

§ Reports and dashboards

§ Lab management


When working with multiple development teams developing multiple solutions and versions in parallel on an enterprise delivery, the following base process is minimally recommended:


§ Developers should have isolated VMs to work on either hosted locally or centrally to share infrastructure between team members. Every developer work-item needs to be committed to the TFS environment (integration environment).

§ The development lead is in charge of making sure that the developments (customizations, configurations, and custom source files) can integrate in the integration environment without any side effects.

§ The TFS source control should be considered as the central repository for all solution items:

§ All CRM customizations (Entities, forms, views, workflows, templates, web resources, etc.)

§ Base data

§ Source codes and projects for custom components

o Plug-ins, custom activities

o Custom web applications

o Custom .Net components

o Custom services and other external components

o Web resource files


§ External components and references (SDK, Enterprise Library, etc.)

§ Unit test source codes for above components

§ Automated test sources for the entire application

§ Setup package for the entire application

§ Deployment scripts for the packages

§ Build definition for the entire application

The customer requirements and the work-item break-down should be stored in TFS work-item management. This way the development check-ins can be connected to the work-items, and the built-in development tracking and reporting functions of TFS can be leveraged. You will also be able to use the change-impact analysis functions of TFS for risk analysis and selecting the regression test scenarios for a specific change-set.

In multi-feature and multi-team scenarios you usually need multiple development and test environments, probably using different versions of the application. The TFS build automation and lab management functions provide the necessary toolset for managing daily build and deployment steps. You find more details about the topic in the Build and Test sections.

Version Control Process

The CRM customizations are out-of-the-box bound to a single CRM solution zip file, which requires special techniques to manage as part of a source control system.

The SolutionPackager tool (see Use the SolutionPackager Tool to Compress and Extract a Solution File) is designed to unpack (split) a solution .zip file into a broad set of files within folders by entity name. The breakout into files ensures that modifications to different components do not cause changes in the same files. For example, modifying an entity Form and a View results in changes to different files (hence no conflicts to resolve). Pack takes the files from disk and glues them back together to construct a managed or unmanaged .zip file.
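As a sketch, the extract and pack operations look like this from the command line (the zip file and folder names here are illustrative):

```shell
# Unpack a solution zip into a folder tree suitable for version control
SolutionPackager.exe /action:Extract /zipfile:MySolution.zip /folder:.\MySolutionSrc

# Pack the folder tree back into an unmanaged solution zip
SolutionPackager.exe /action:Pack /zipfile:MySolution.zip /folder:.\MySolutionSrc /packagetype:Unmanaged
```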

The development process for Solution Packages with the SolutionPackager tool is illustrated in the following graphic.



 


Developers work in their daily environments on the CRM development tasks allocated to them. Before starting a work item, the developer (or the build master) gets the latest version of the customization tree from TFS and uses the SolutionPackager tool to create the solution zip file. The developer (or the build master) then deploys the latest solution package to the development environment. The same solution can also be deployed to the test CRM environment.


At the end of the day, or after finishing a work item, the developer exports the solution zip file from the development environment. Executing the SolutionPackager tool on the zip file generates the separate customization elements of the entire solution. The customization files need to be checked out and checked in individually, and only when changed.

The Visual Studio project may be created manually. The SolutionPackager tool will automatically create the folder structure below it (the unpack script needs to be executed in the solution project folder).

Manual Steps

The current version of the SolutionPackager tool provides the basic functionality to convert the packaged customization zip file into a version-controllable folder structure. The following manual steps are still needed by developers or test leads to maintain the source tree:

1.   Check-in/check-out of the specific customization elements that will be affected by a work-item change.

2.   Executing the pack/unpack operation manually and selecting the changed customization elements (Note: this can be integrated into the clean and deploy process of the Developer Toolkit).

3.   Taking care of the managed and unmanaged version of components at check-in time.

4.   Managing the assembly versioning (FQ strong names) of plug-ins used by a CRM solution.

5.   Managing RibbonDiffXML (merging changes).

The following techniques may be used to further support the version control process:

§ Removing assembly strong names from customization files to be able to track only the real changes.

§ Some schema parts are reordered randomly during solution import/export; these elements may be ordered alphabetically for more convenient source control tracking.

Localization Challenges

Enterprise CRM projects usually face the requirement of supporting multiple locales. Although CRM has out-of-the-box support for localization, it is always a challenge to provide full solution localization, because different components require different localization strategies and an explosion of components may be required.

Localization of some specific CRM components is currently not supported or not applicable. For example:


§ Workflows

§ Dialogs

§ Templates

§ Roles & FLS profiles

§ Plug-in assemblies


CRM Developer Toolkit


The CRM Developer Toolkit may be used for daily developer work for creating streamlined plug-ins, generating codes, solutions, and accessing CRM customization pages. The source code should be checked in directly under TFS. The applied customizations can be exported, unpacked from the CRM organization, and the specific changed elements can be checked in manually.

Note: The Developer Toolkit can support the process of packing and unpacking by turning on the ability to export a CRM Solution as part of the local deployment process.

Sitemap, Ribbon Editor

The CRM customization system only supports editing the sitemap and ribbon definitions manually as XML files, after a manual export and followed by an import operation. The SiteMap and Ribbon Editor tools provide a graphical UI for executing these types of customizations.

These tools can be also combined with the unpacked structure:

§ Sitemap.xml is unpacked into a separate xml file and can be directly edited by SiteMap Editor.

§ The Ribbon Editor supports editing directly on a CRM environment, so developers need to edit on the server and then unpack the solution zip file and check-in the changes.

Daily Developer Work

The developers should use unmanaged solutions and shared configuration environments during development time. One developer usually works on one solution (either CRM or other Visual Studio solution), but multiple developers can work on the same solution in parallel.

 

The developer environments should be typically updated on a per-day basis using the last successful build. You should consider the best environment allocation and setup strategy for your team, depending on the actual project and team setup (see Deployment and Test).

The developers typically use standard CRM customization UI for executing their daily work. They create CRM customization elements (entities, screens, ribbons, views, dialogs, plug-in registrations, etc.). They may use Visual Studio for creating custom developed plug-ins, web applications, or other components, and may also use other external tools such as resource editors, designers, sitemap, ribbon, and metadata editor for CRM.

The editing of external components and resources can typically be executed offline on the developer machine after executing the TFS check-out operation (the Visual Studio built-in editors usually manage check-out automatically).

Just before check-in, developers need to export the solution from the CRM environment and execute a special batch file per solution, which unpacks the customization file.

Developers need to take care of the actual check-in. They need to be aware of the changes they made and which schema members those changes affected. Only the changed elements need to be checked in, and there are also special cases where the checked-in elements need to be merged back or compared with the previous version (for example, RibbonDiffXml).

Before the actual check-in operation, the developer must get the latest sources and create a local build, including a solution package, to deploy and test in the development environment; this is the recommended practice for quality check-ins. This process may be also supported by the Gated Check-in or Continuous Integration feature of TFS.

The daily development process is illustrated in the following figure.



 

Build Automation

The build process can be automated using the standard TFS build agent. The build agent is a Windows service that executes the processor-intensive and disk-intensive work associated with tasks such as getting files from and checking files into the version control system, provisioning the workspace for the builds, compiling the source code, and running tests.

 


A typical build process consists of the following steps:

1.   Getting the build definition (You may have multiple parallel projects/builds.)

2.   Update build number (solution versioning)

3.   Prepare build context

a.   Create drop location

b.   Get build agent

c.   Get build directory and initialize workspace

4.   Get source files

5.   Compile sources and create setup packages

6.   Run unit tests

7.   Process test results

8.   Drop build and notify

The build process may be extended further by automated deployment and testing (for more details see the Build, Test and Deployment sections):

§ Rollback test environments

§ Deploying setup packages to test environments

§ Executing automated test steps/schedule manual test scenarios






Offline Build Process

The offline build process is designed as an independent build flow enabled to execute the building of a whole CRM solution package without a running instance of Dynamics CRM. The offline build process leverages the static schema of the solution elements and customization files (see Development and Version Control).

 




The build process can be easily extended to use the SolutionPackager tool. The CRM Solution Builder project template and target file samples can be used to integrate the pack operation into the daily build process (see CRM Solution Builder).

During the build process, the Visual Studio standard projects will be compiled, built, and dropped as defined in the build and project definitions.

Building the CRM customization project will trigger the pack operation with the up-to-date customization source tree (see the solution tree structure above). All customizations will be merged together into single customization.xml and solution.xml files. The previously built solution resources and assemblies can be included directly in the solution package.

Note: ILMerge operation may be required to make DB-registered assemblies work with referenced assemblies.

Automating Deployment

The deployment automation of a packed CRM solution can be executed identically to standard CRM solution deployment.

On enterprise projects the following scenarios can cover the typical deployment requirements:

§ Automated deployment of development environments, test environments, and integration environments

§ Offline setup packages and scripts to enable deployment to acceptance and production environments as a sandbox

§ Rollback environments to a baseline (both CRM components and the CRM database)

o May already contain existing solutions and possible CRM Update Rollups

§ Deploying CRM solutions

§ Deploying CRM base data and custom security roles

§ Additional configuration of the organization, BU, and user settings using the CRM API

Note: For further details on CRM automated deployment process see the Test and Deployment sections.

Managed/Unmanaged Solutions

The version control differences between managed and unmanaged solutions are already described in the Managing Version Requirements section. The SolutionPackager tool will process and unpack both types of solutions. The unpack process creates the differing elements with a "managed" postfix for easier version control management.


The general recommendations for using managed or unmanaged solutions:

§ Use unmanaged solutions for development.

§ Use managed solutions for all downstream environments.

§ Use as few solutions as possible for easier management.

§ Customize existing managed solutions as little as possible.

§ Execute integration tests using managed solutions frequently to test and address possible collisions.

§ Test managed solution combinations for every possible or supported scenario.

Do not use specialized update operation requests in Microsoft Dynamics CRM

In releases prior to Microsoft Dynamics CRM 2015 Update 1, specialized messages were required in order to update certain entity attributes. Beginning with that release, the SDK has been simplified and now supports updating these specialized fields through the standard Update operation. The CRM Web API does not support the specialized messages, and once the CRM 2011 services (currently labeled as deprecated) are removed, there will be no support for performing these operations.


The following message requests are deprecated as of Microsoft Dynamics CRM 2015 Update 1:

  • AssignRequest 
  • SetStateRequest 
  • SetParentSystemUserRequest 
  • SetParentTeamRequest 
  • SetParentBusinessUnitRequest 
  • SetBusinessEquipmentRequest 
  • SetBusinessSystemUserRequest

In order to update the corresponding specialized attributes, use the UpdateRequest message. Messages can be sent from a plug-in, a workflow activity, JavaScript through the web services, or from an integrating application.

These specialized messages will continue to work with the 2011 endpoint. However, the recommendation is to use the UpdateRequest or Update method when possible to set these attributes. The Update message simplifies the SDK API and makes it easier to code standard data integration tools used with Dynamics CRM. In addition, it is easier to code and register a plug-in to execute for a single Update message instead of multiple specialized messages. The AttributeMetadata.IsValidForUpdate property for the above listed attributes has been changed to true in this release to enable this capability.
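For example, a state change that previously required SetStateRequest can now be written as a plain update. The following is a sketch, assuming service is an IOrganizationService and accountId is a known record id:

```csharp
// Sketch: deactivate an account using the plain Update message
// instead of the deprecated SetStateRequest.
var account = new Entity("account", accountId);
account["statecode"] = new OptionSetValue(1);   // Inactive
account["statuscode"] = new OptionSetValue(2);  // Inactive
service.Update(account);
```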

You can continue to use these specialized messages of the 2011 endpoint in your code. However, the Web API that eventually replaces the 2011 endpoint supports only the Update message for these types of operations. If you want to get a head start on changing your code to align with the Web API, you can now do so. See Web API Preview for more information.


Impact of this change on plug-ins


When update requests are processed that include both owner fields plus other standard fields for business-owned entities, plug-ins registered for the Update message in pipeline stage 20 and/or stage 40 execute once for all non-owner fields, and then once for the owner fields. Examples of owner fields would be businessunit and manager (for a SystemUser entity). Examples of business-owned entities include SystemUser, BusinessUnit, Equipment, and Team.

When update requests are processed that include both state/status fields plus other standard fields, plug-ins registered for the Update message in pipeline stage 20 and/or stage 40 execute once for all non-state/status fields, and then once for the state/status fields.

In order for plug-in code to receive the full data changes of the update, you must register the plug-in in stage 10 and then store relevant information in SharedVariables in the plug-in context for later plug-ins (in the pipeline) to consume.
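A stage-10 plug-in that captures the full update payload might look like the following sketch (the class name and the "FullTarget" shared variable key are illustrative):

```csharp
using System;
using Microsoft.Xrm.Sdk;

// Sketch: a pre-validation (stage 10) plug-in that stores the full update
// payload in SharedVariables for later pipeline stages to consume.
public class CaptureUpdateTargetPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)
            serviceProvider.GetService(typeof(IPluginExecutionContext));

        if (context.InputParameters.Contains("Target") &&
            context.InputParameters["Target"] is Entity)
        {
            var target = (Entity)context.InputParameters["Target"];

            // Plug-ins registered in stage 20/40 can read this shared
            // variable to see the complete set of attributes that were
            // part of the original request.
            context.SharedVariables["FullTarget"] = target;
        }
    }
}
```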


Impact of this change on workflows

When update requests are processed that include both owner fields plus other standard fields, workflows registered for the Update message execute once for all non-owner fields, and then once for the owner fields. Workflows registered for the Assign message by users continue to be triggered by updates to owner fields.

When update requests are processed that include both state/status fields plus other standard fields, workflows registered for the Update message execute once for all non-state/status fields, and then once for the state/status fields. Workflows registered for the Change Status step continue to be triggered by updates to state/status fields.

Connecting to Dynamics 365 CRM Using the SDK and NuGet

Method 1: Using the SDK

 

Using the PowerShell script below, available at https://docs.microsoft.com/en-us/dynamics365/customer-engagement/developer/download-tools-nuget, you can download the tools in the SDK to your computer in four folders.

 

$sourceNugetExe = "https://dist.nuget.org/win-x86-commandline/latest/nuget.exe"

$targetNugetExe = ".\nuget.exe"

Remove-Item .\Tools -Force -Recurse -ErrorAction Ignore

Invoke-WebRequest $sourceNugetExe -OutFile $targetNugetExe

Set-Alias nuget $targetNugetExe -Scope Global -Verbose

 

##

##Download Plugin Registration Tool

##

./nuget install Microsoft.CrmSdk.XrmTooling.PluginRegistrationTool -O .\Tools

md .\Tools\PluginRegistration

$prtFolder = Get-ChildItem ./Tools | Where-Object {$_.Name -match 'Microsoft.CrmSdk.XrmTooling.PluginRegistrationTool.'}

move .\Tools\$prtFolder\tools\*.* .\Tools\PluginRegistration

Remove-Item .\Tools\$prtFolder -Force -Recurse

 

##

##Download CoreTools

##

./nuget install  Microsoft.CrmSdk.CoreTools -O .\Tools

md .\Tools\CoreTools

$coreToolsFolder = Get-ChildItem ./Tools | Where-Object {$_.Name -match 'Microsoft.CrmSdk.CoreTools.'}

move .\Tools\$coreToolsFolder\content\bin\coretools\*.* .\Tools\CoreTools

Remove-Item .\Tools\$coreToolsFolder -Force -Recurse

 

##

##Download Configuration Migration

##

./nuget install  Microsoft.CrmSdk.XrmTooling.ConfigurationMigration.Wpf -O .\Tools

md .\Tools\ConfigurationMigration

$configMigFolder = Get-ChildItem ./Tools | Where-Object {$_.Name -match 'Microsoft.CrmSdk.XrmTooling.ConfigurationMigration.Wpf.'}

move .\Tools\$configMigFolder\tools\*.* .\Tools\ConfigurationMigration

Remove-Item .\Tools\$configMigFolder -Force -Recurse

 

##

##Download Package Deployer

##

./nuget install  Microsoft.CrmSdk.XrmTooling.PackageDeployment.WPF -O .\Tools

md .\Tools\PackageDeployment

$pdFolder = Get-ChildItem ./Tools | Where-Object {$_.Name -match 'Microsoft.CrmSdk.XrmTooling.PackageDeployment.Wpf.'}

move .\Tools\$pdFolder\tools\*.* .\Tools\PackageDeployment

Remove-Item .\Tools\$pdFolder -Force -Recurse

 

##

##Remove NuGet.exe

##

Remove-Item nuget.exe

 

 

If you downloaded the Dynamics 365 SDK with the script above, you will find the following three DLLs under CoreTools:

 

Microsoft.Xrm.Sdk.dll

Microsoft.Crm.Sdk.Proxy.dll

Microsoft.Xrm.Tooling.Connector.dll

 

After adding these DLLs to your application as references, you need to add the following using directives to the class you will be working in.

 

using Microsoft.Xrm.Sdk;

using Microsoft.Xrm.Sdk.Query;

using Microsoft.Xrm.Tooling.Connector;

 

Method 2: Using NuGet

Using NuGet packages eliminates many problems during application development and build. Especially if you develop code as a team and store it in environments such as TFS, VSO, or GitHub, the DLL references that each developer keeps in different locations on their own machine will cause trouble later. The most practical way to avoid this is to obtain and use the references through NuGet.

 

Find the following two packages from the NuGet package console or through the Visual Studio interface and add them to your project.

 

  1. Microsoft.CrmSdk.CoreAssemblies
  2. Microsoft.CrmSdk.XrmTooling.CoreAssembly

 

The other packages will be installed automatically.

 

Common Steps for Both Methods

Take whichever of the following connection strings suits your scenario:

<!-- Online using Office 365 -->

<!--<add name="Server=CRM Online"

connectionString="Url=https://myserver.crm.dynamics.com; Username=user@myserver.com; Password=password; authtype=Office365"/>-->

<!-- On-premises with provided user credentials -->

<!-- <add name="Server=myserver, organization=AdventureWorksCycle, user=administrator"

connectionString="Url=http://myserver/AdventureWorksCycle; Domain=mydomain; Username=administrator; Password=password; authtype=AD"/> -->

<!-- On-premises using Windows integrated security -->

<!-- <add name="Server=myserver, organization=AdventureWorksCycle"

connectionString="Url=http://myserver/AdventureWorksCycle; authtype=AD"/> -->

<!-- On-Premises (IFD) with claims -->

<!--<add name="Server=myserver.com, organization=contoso, user=someone@myserver.com"

connectionString="Url=https://contoso.myserver.com/contoso; Username=someone@myserver.com; Password=password; authtype=IFD"/>-->

 

We pass the relevant connection string to the CrmServiceClient class and use the service we obtain.

 

CrmServiceClient conn = new CrmServiceClient(ConfigurationManager.ConnectionStrings["Server=CRM Online"].ToString());

var _orgService = (IOrganizationService)conn.OrganizationServiceProxy;

 

You can now retrieve data with FetchXML or perform Insert/Update/Delete/Execute operations through _orgService.

Avoid selecting all columns via Microsoft Dynamics CRM query APIs

For optimal performance, you should select only the minimum amount of data needed by your application when querying CRM data. Queries that include a defined ColumnSet where the ColumnSet.AllColumns property is true instruct the CRM data access platform to issue a SELECT * on all physical data included in the query plan. This scenario should be avoided whenever possible.
 

Violation Examples

ColumnSet.AllColumns setter method call

    var columns = new ColumnSet();

    columns.AllColumns = true;

 

    var query = new QueryExpression("account");

    query.ColumnSet = columns;

 

    var results = service.RetrieveMultiple(query);

ColumnSet(bool allColumns) constructor overload

    var query = new QueryExpression("account")

    {

        ColumnSet = new ColumnSet(true)

    };

 

    var results = service.RetrieveMultiple(query);

ColumnSet(bool allColumns) constructor overload for RetrieveRequest

    var entity = service.Retrieve("account", Guid.NewGuid(), new ColumnSet(true));


Guideline Examples 

 

ColumnSet(param string[] columns) constructor overload for QueryExpression

    var query = new QueryExpression("account")

    {

        ColumnSet = new ColumnSet("name", "address1_city")

    };

 

    var results = service.RetrieveMultiple(query);

ColumnSet(param string[] columns) constructor overload for RetrieveRequest

    var entity = service.Retrieve("account", Guid.NewGuid(), new ColumnSet("name", "address1_city"));

ColumnSet.AddColumn(string column) method call

    var query = new QueryExpression("account");

    query.ColumnSet.AddColumn("name");

    query.ColumnSet.AddColumn("address1_city");

 

    var results = service.RetrieveMultiple(query);

ColumnSet.AddColumns(param string[] columns) method call

    var query = new QueryExpression("account");

    query.ColumnSet.AddColumns("name", "address1_city");

 

    var results = service.RetrieveMultiple(query);


Usage of the Retrieve method should set the columnSet parameter to a ColumnSet instance with specified columns. Usage of QueryExpression should set the QueryBase.ColumnSet property with the required attributes.

The following message classes reference a ColumnSet instance:

Message

ConvertQuoteToSalesOrderRequest Class 

ConvertSalesOrderToInvoiceRequest Class 

GenerateInvoiceFromOpportunityRequest Class 

GenerateQuoteFromOpportunityRequest Class 

GenerateSalesOrderFromOpportunityRequest Class 

QueryByAttribute Class 

QueryExpression Class 

RetrieveAllChildUsersSystemUserRequest Class 

RetrieveBusinessHierarchyBusinessUnitRequest Class 

RetrieveMembersTeamRequest Class 

RetrieveRequest Class 

RetrieveSubsidiaryTeamsBusinessUnitRequest Class 

RetrieveSubsidiaryUsersBusinessUnitRequest Class 

RetrieveTeamsSystemUserRequest Class 

RetrieveUnpublishedRequest Class 

RetrieveUserSettingsSystemUserRequest Class 

ReviseQuoteRequest Class 

SearchByBodyKbArticleRequest Class

 

Build queries with QueryExpression

In Microsoft Dynamics 365 (online & on-premises), you can use the QueryExpression class to programmatically build a query containing data filters and search conditions that define the scope of a database search. A query expression is used for single-object searches. For example, you can create a search to return all accounts that match certain search criteria. The QueryBase class is the base class for query expressions. There are two derived classes: QueryExpression and QueryByAttribute. The QueryExpression class supports complex queries. The QueryByAttribute class is a simple means to search for entities where attributes match specified values.
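As a brief sketch of the contrast described above, the following fragment builds the same simple equality search with both classes. It assumes a connected IOrganizationService instance named service; the filter values ("Amsterdam", state code 0 for active accounts) are illustrative only.

```csharp
// QueryExpression: supports complex criteria via FilterExpression.
var query = new QueryExpression("account")
{
    ColumnSet = new ColumnSet("name", "address1_city")
};
// Add search conditions to the query's criteria (a FilterExpression).
query.Criteria.AddCondition("address1_city", ConditionOperator.Equal, "Amsterdam");
query.Criteria.AddCondition("statecode", ConditionOperator.Equal, 0);

EntityCollection results = service.RetrieveMultiple(query);

// QueryByAttribute: a simpler form for plain attribute-equals-value searches.
var byAttribute = new QueryByAttribute("account")
{
    ColumnSet = new ColumnSet("name", "address1_city")
};
byAttribute.AddAttributeValue("address1_city", "Amsterdam");

EntityCollection sameCity = service.RetrieveMultiple(byAttribute);
```

QueryByAttribute cannot express operators other than equality; once a query needs ranges, wildcards, or nested filters, QueryExpression is the appropriate choice.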

Query expressions are used in methods that retrieve more than one record, such as the IOrganizationService.RetrieveMultiple method; in messages that perform an operation on a result set specified by a query expression, such as BulkDeleteRequest; and in cases where the ID of a specific record is not known.

In addition, there is a new attribute on the organization entity, Organization.QuickFindRecordLimitEnabled. When this Boolean attribute is true, a limit is imposed on quick find queries. If a user provides search criteria in quick find that is not selective enough, the system detects this and stops the search. This supports a faster form of quick find and can make a big performance difference.
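As a minimal sketch of inspecting this setting, the flag lives on the organization entity and can be read with a targeted ColumnSet. This assumes a connected IOrganizationService named service and a known organization record ID; the attribute's logical name is assumed to be quickfindrecordlimitenabled.

```csharp
// Read only the quick find limit flag from the organization record.
// organizationId is assumed to be obtained elsewhere (e.g. from WhoAmI).
Entity org = service.Retrieve(
    "organization",
    organizationId,
    new ColumnSet("quickfindrecordlimitenabled"));

bool limitEnabled = org.GetAttributeValue<bool>("quickfindrecordlimitenabled");
```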


References:  

Use the QueryExpression Class

http://msdn.microsoft.com/en-us/library/gg334688.aspx

ColumnSet Class

http://msdn.microsoft.com/en-us/library/microsoft.xrm.sdk.query.columnset.aspx

Use of the ColumnSet Class

http://msdn.microsoft.com/en-us/library/gg309532.aspx

Dynamics Sure Step Methodology


Microsoft Dynamics Sure Step is a full customer lifecycle methodology for all Microsoft Dynamics® solutions, providing the Microsoft ecosystem with comprehensive implementations through delivery guidance, project management discipline alignment, and field-driven best practices. Sure Step is designed to enable the solution provider to better serve their customers by helping reduce their Microsoft Dynamics total cost of ownership. Sure Step content covers the Microsoft Dynamics ERP and CRM suite of solutions, including Microsoft Dynamics AX, Dynamics CRM, Microsoft Dynamics GP, Microsoft Dynamics NAV, and Microsoft Dynamics SL. The guidance, tools, templates, and best practices provided in the methodology can help increase the consistency, timeframes, quality, and success of Microsoft Dynamics engagements.

Sure Step is considered a full lifecycle methodology because it encompasses all phases of a customer engagement. Sure Step begins with a Solution Envisioning phase to help customers determine the right solution for their needs.  The Solution Envisioning phase is followed by Solution Delivery phases to implement their solution and to provide guidance for the operation and maintenance of the solution in production. For existing Microsoft Dynamics customers seeking to progress their solutions to the latest product releases, Sure Step also provides Upgrade Assessments in the Solution Envisioning phase, followed by Solution Delivery phases to upgrade their solution and then to maintain the production solution in operation.

Sure Step has six phases: Diagnostic, Analysis, Design, Development, Deployment, and Operation. The Diagnostic phase encompasses Solution Envisioning and provides guidance on product capabilities, including content on focus industries for a corresponding product. The Decision Accelerator Offering is an important part of the Diagnostic phase, designed to reduce the risks and concerns for the customers in their decision-making process for new/upgrade ERP/CRM solutions.


The Sure Step Methodology offers the project types described in the following table:

 

Project type

Description

Standard

A lean approach for implementing Microsoft Dynamics solutions at a single site.

Rapid

An accelerated approach for implementing Microsoft Dynamics solutions with minimal or no customizations.

Enterprise

A standardized approach for implementing Microsoft Dynamics solutions in complex single-site deployments or in global/multi-site organizations wherein country/site-specific unique business needs have to be factored on top of a core solution.

Agile

An iterative approach to implementing Microsoft Dynamics solutions at a single site requiring specific features and moderate-to-complex customizations. While the Standard, Rapid, Enterprise, and Upgrade project types are waterfall-based, the Agile project type uses the Sprint cycle approach to solution deployment.

Upgrade

An approach to upgrade an existing Microsoft Dynamics solution to a subsequent release of that solution. This begins with a Technical Upgrade to address moving existing functionality to the subsequent release. Any new functionality that is desired can be deployed by using one of the other project types: Rapid, Standard, Agile, or Enterprise.

Sure Step also features Cross Phase Processes that span the project types. A cross-phase process is a group of related activities that span multiple implementation phases in a specific project scenario. The Sure Step Methodology also provides Optimization Offerings that feature proactive and post go-live services that are designed to assist the customer and solution provider with an additional level of due diligence in the solution delivery lifecycle.

Additionally, Sure Step provides Project Management and Organizational Change Management libraries, with content to support these key functions in a solution delivery engagement. Sure Step also includes an overview of roles typically involved in an engagement, both from consulting (solution provider) and customer perspectives.

Note: The Dynamics Sure Step methodology provides strong delivery guidance and a toolset for managing an entire Dynamics CRM project, positioning Dynamics CRM as the main element of the solution and the methodology. Enterprise solutions usually consist of multiple products and even diverse technologies, making it challenging to apply the entire process. Applying the templates and recommendations of Sure Step should always be considered and made part of the specific chosen ALM method to lower the risks and to make the CRM delivery process more transparent.

Dynamics Sure Step lacks guidance on the tooling and automation techniques for specific processes; the tooling should always be selected according to the specific delivery environment and the requirements of the solution.

MSF-based Solution Delivery

The Microsoft Solutions Framework (MSF) provides an adaptable framework for successfully delivering information technology solutions faster and with fewer people and less risk while enabling higher-quality results. MSF helps teams directly address the most common causes of technology project failure to improve success rates, solution quality, and business impact. Created to deal with the dynamic nature of technology projects and environments, MSF fosters the ability to adapt to continual change within the course of a project.

MSF is called a framework instead of a methodology for specific reasons. As opposed to a prescriptive methodology, MSF provides a flexible and scalable framework that can be adapted to meet the needs of any project (regardless of size or complexity) to plan, build, and deploy business-driven technology solutions. The MSF philosophy holds that there is no single structure or process that optimally applies to the requirements and environments for all projects. It recognizes that, nonetheless, the need for guidance exists. As a framework, MSF provides this guidance without imposing so much prescriptive detail that its use is limited to a narrow range of project scenarios.


MSF components can be applied individually or collectively to improve success rates for projects such as:

§             Software development projects, including mobile, web and e-commerce applications, web services, mainframe, and n-tier

§             Infrastructure deployment projects, including desktop deployments, operating system upgrades, enterprise messaging deployments, and configuration and operations management systems deployments

§             Packaged application integration projects, including personal productivity suites, enterprise resource planning (ERP), and enterprise project management solutions

§             Any complex combination of the above

MSF guidance for these different project types focuses on managing the “people and process” as well as the technology elements that most projects encounter. Because the needs and practices of technology teams are constantly evolving, the materials gathered into MSF are continually changing and expanding to keep pace.

As a framework, MSF contains multiple components that can be used individually or adopted as an integrated whole. Collectively, they create a solid yet flexible approach to the successful execution of technology projects. These components are described in the following table.

 

MSF component

Description

MSF foundational principles

The core principles upon which the framework is based. They express values and standards that are common to all elements of the framework.

MSF models

Schematic descriptions or “mental maps” of the organization of project teams and processes (Team Model and Process Model—two of the major defining components of the framework).

MSF disciplines

Areas of practice using a specific set of methods, terms, and approaches (Project Management, Risk Management, and Readiness Management: the other major defining components of the framework).

MSF key concepts

Ideas that support MSF principles and disciplines and are displayed through specific proven practices.

MSF proven practices

Practices that have been proven effective in technology projects under a variety of real-world conditions.

MSF recommendations

Optional but suggested practices and guidelines in the application of the models and disciplines.

The MSF Process Model combines concepts from the traditional waterfall and spiral models to capitalize on the strengths of each model. The Process Model combines the benefits of milestone-based planning from the waterfall model with the incrementally iterating project deliverables from the spiral model.

The Process Model phases and activities appear in the following list:

§             Envision: Describe the solution concept and define the project team necessary to deliver it.

§             Plan: Assemble detailed plans and designs necessary to deliver the solution.

§             Build: Construct a solution that includes all aspects of the project needs.

§             Stabilize: Polish and verify that the solution meets customer and user needs and expectations.

§             Deploy: Deploy and integrate the solution to its intended production environments.


The MSF Process Model is depicted in the following graphic:


Iterative Solution Development

 

Iterative Solution Development (ISD) is a methodology used to reduce solution delivery risk and highlight Microsoft’s deep experience building custom application-development solutions. ISD enables on-going productive customer feedback, a single system of record for improved traceability, and consistent guidance on tools and application development recommended practices.

§             ISD is recommended for extremely complex, custom-development solutions.

§             ISD is the Microsoft Services Solution Delivery (SSD) approach for envisioning, planning, stabilizing, and deploying complex custom application development solutions.

§             ISD is derived from Services Delivery Methodology (SDM), which is based on Microsoft Solution Framework (MSF) and sourced from the World Wide Solution Development Centers. ISD is used when Microsoft is the prime contractor for large, complex, and custom application development engagements.

§             ISD leverages five core pillars to ensure delivery: a team model, a mentoring model, a process model, a governance model, and guidance focused on management of the development environment (built on TFS).


ISD Phases

§             The ISD Discovery Phase provides detailed guidance on all technical and business pre-sales activities required to win large, complex, and custom application development Tier 1 deals.

§             The ISD Sketch Phase provides detailed guidance on all of the activities required to successfully deliver a solid statement of work (SOW) for the Build, Stabilize, and Deploy phases. The ISD Sketch Phase is designed to be used on large, complex, and custom application development Tier 1 engagements.

§             The ISD Build & Stabilize Phase is the process of constructing the solution for the customer. This is the main Delivery Management phase. The iteration plan developed during Sketch is executed, delivering working features and capabilities in each release. It is built on the ISD team model and leverages TFS tooling and automation, covering development and test processes, source and version control policies, and testing and QA methods to ensure the ISD goal of a high-quality solution.

§             The ISD Deploy Phase provides detailed guidance on all the activities required to successfully release custom application development solutions into production. The ISD Release Phase is designed to be used on large, complex, and custom application development Tier 1 engagements.

§             The ISD Support Phase provides detailed guidance on all the activities required to successfully support custom application development solutions once they have been deployed into production. The ISD Support Phase is designed to be used on large, complex, and custom application development Tier 1 engagements.

 

Microsoft Dynamics CRM 2016 Service Pack 1 Performance Benchmark on Azure Infrastructure as a Service (IaaS)

You can download the performance paper for Microsoft Dynamics CRM 2016 Service Pack 1 (SP1) running on Azure Virtual Machines.


Overview 

Microsoft Dynamics CRM is designed to deliver intelligent customer engagement to the market – helping companies deliver customer experiences that are personalized, proactive and predictive. Dynamics CRM helps provide data anywhere and across a wide array of devices, ranging from phones and tablets to PCs, and through a wide array of client types, such as smartphone apps, tablet apps, and Microsoft Dynamics CRM for Outlook. This paper highlights the scalability and performance that can be achieved in terms of concurrent users and feature functionality with the latest release of Dynamics CRM 2016 SP1, running on the standard “off the shelf” Azure Virtual Machines. 



Conclusion 

The results reflect the scalability and performance achieved on a specific Dynamics CRM 2016 SP1 implementation running on standard Azure Virtual Machines in a test environment. Actual performance may vary based on factors ranging from specific customizations deployed to the geographic distribution of users and network latency or bandwidth. Customers should expect that results will vary for each implementation and should perform their own performance testing based on their needs or requirements. In some cases, customers may achieve higher levels of performance by fine-tuning or optimizing the configuration of Microsoft Dynamics CRM. 

These results demonstrate the robustness of Dynamics CRM 2016 SP1 and its capability to handle concurrent user activities with ease for enterprise CRM scenarios. 

You can download it from the link below:

Microsoft Dynamics CRM 2016 SP1 Performance Benchmark on Azure IaaS.pdf (1.1 MB)

Join Us at Summit EMEA 2017

I'm pleased to announce that I'm one of the presenters at Summit EMEA 2017, which will be held 4-6 April at the RAI Amsterdam.

About my session:

Development on Dynamics 365/CRM
Tuesday, April 4
 | 2:00 PM - 3:00 PM
 | Room: G110

Format: Presentation
Level: Intermediate

Looking to extend or write your first code for your Dynamics CRM environment? This session is focused on those new to CRM development and on CRM administrators interested in taking the plunge into "code" customization. It covers the development structure of the Dynamics platform from version 2011 onward, so attendees can easily see the differences between versions from a development perspective. It is particularly helpful for those who work on upgrade projects.

About Summit EMEA:

Summit EMEA 2017 will be held 4-6 April at the RAI Amsterdam. Summit EMEA provides Microsoft Dynamics users with best-in-class education on how to maximize the performance of Microsoft Dynamics CRM, Dynamics AX, Dynamics NAV, and Power BI products. This event welcomes IT executives and end users from organizations using Microsoft Dynamics for learning and collaboration in a trustworthy environment.

This community-driven conference features special access to Microsoft leadership and a diverse array of interactive training workshops led by experts and users. Session topics include:

·         BI & Reporting

·         CRM Cloud Extensions

·         Developer

·         Leadership & User Adoption

·         Microsoft Dynamics 365

·         Power User

·         User Showcase

·         Partner Solutions Showcase

Sessions are currently available online. View session details, descriptions, and speaker information.

Interested in Power BI? There will also be three full days of Power BI sessions available to attendees. View Power BI sessions.

This event is brought to you by AXUG, CRMUG, NAVUG, and PBIUG. These User Groups are the world's most influential communities of Dynamics users, business leaders, IT professionals, developers, and partners - with members sharing a common goal to maximize and advance the performance of their Dynamics investment.

Register today! Begin your conference registration.