Add a UserAgent to the IHttpClientFactory in .NET Core

Using an IHttpClientFactory to create HttpClient connections has a number of advantages: you can configure several HttpClients at startup, and each client will be reused, including the properties attached to it.

In a previous post I showed how to create a Polly retry mechanism for an HttpClient. Adding a UserAgent to an HttpClient is even easier.

In ConfigureServices() (in the Startup.cs file), add the following code:

services.AddHttpClient("HttpClient", 
  client => 
  client.DefaultRequestHeaders.UserAgent.ParseAdd("my-bot/1.0")
);

This imaginary image service will get an image using the "HttpClient" connection. Every time a GET request is made, the UserAgent will be "my-bot/1.0":

using System;
using System.Net.Http;
using System.Threading.Tasks;

namespace MyCode
{
  public class ImageService
  {
    private readonly IHttpClientFactory _clientFactory;

    public ImageService(IHttpClientFactory clientFactory)
    {
      _clientFactory = clientFactory;
    }

    public async Task<string> GetImage(string imageUrl)
    {
      try
      {
        var httpClient = _clientFactory.CreateClient("HttpClient");
        using var response = await httpClient.GetAsync(imageUrl);
        if (!response.IsSuccessStatusCode)
          throw new Exception($"GET {imageUrl} returned {response.StatusCode}");
        if (response.Content.Headers.ContentLength == null)
          throw new Exception($"GET {imageUrl} returned zero bytes");
        // ...
        // Do something with the image being fetched, then return it.
        // Reading it as a string here is just a placeholder so the method
        // fulfils its Task<string> signature.
        return await response.Content.ReadAsStringAsync();
      }
      catch (Exception exception)
      {
        throw new Exception($"Failed to get image from {imageUrl}: {exception.Message}", exception);
      }
    }
  }
}
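The image service itself must also be registered in ConfigureServices() so the IHttpClientFactory can be injected into it. A minimal sketch; the transient lifetime is just a suggestion:

// In ConfigureServices(), next to the AddHttpClient registration above
services.AddTransient<ImageService>();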


Sending JSON with .NET Core QueueClient.SendMessageAsync

In .NET Core, Microsoft.Azure.Storage.Queue has been replaced with Azure.Storage.Queues, and the CloudQueueMessage that you added using queue.AddMessageAsync() has been replaced with the simpler queue.SendMessageAsync(string) method.

But this introduces a strange situation when adding serialized JSON objects. If you just add the serialized object to the queue:

using Azure.Storage.Queues;
using Newtonsoft.Json;
using System;
using System.Threading.Tasks;

public async Task SendObject(object someObject)
{
  await queueClient.SendMessageAsync(JsonConvert.SerializeObject(someObject));
}

...then the queue cannot be opened from Visual Studio. You will get an error saying the string is not Base64 encoded:

System.Private.CoreLib: The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or an illegal character among the padding characters.

So you need to Base 64 encode the serialized object before adding it to the queue:

using Azure.Storage.Queues;
using Newtonsoft.Json;
using System;
using System.Threading.Tasks;

public async Task SendObject(object someObject)
{
  await queueClient.SendMessageAsync(Base64Encode(JsonConvert.SerializeObject(someObject)));
}

private static string Base64Encode(string plainText)
{
  var plainTextBytes = System.Text.Encoding.UTF8.GetBytes(plainText);
  return System.Convert.ToBase64String(plainTextBytes);
}

When reading the serialized JSON string back, you do not need to Base64-decode it yourself; it will be directly readable.
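As a side note: newer versions of the Azure.Storage.Queues package can do the Base64 encoding for you if you configure the QueueClient with a message encoding. A sketch, with a placeholder connection string and queue name (check that your package version supports QueueClientOptions.MessageEncoding):

using Azure.Storage.Queues;
using Newtonsoft.Json;

var queueClient = new QueueClient(
  "connection string",
  "my-queue",
  new QueueClientOptions
  {
    // The client Base64 encodes outgoing messages and decodes incoming ones
    MessageEncoding = QueueMessageEncoding.Base64
  });

await queueClient.SendMessageAsync(JsonConvert.SerializeObject(someObject));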


Sitecore Publish item when moved or dragged using uiMoveItems and uiDragItemTo pipelines

Sometimes you have items that need to be published immediately if moved to a new location in the content tree. Sitecore supports this – of course – via the uiMoveItems and uiDragItemTo pipelines.

Move item to new location

This technique really applies to whatever you wish to do with items that are moved or dragged to a new location.

You can move an item in Sitecore in two ways: either by clicking the "Move to" button or by simply dragging the item to a new location. Two separate pipelines handle these actions: uiMoveItems handles the button click, and uiDragItemTo handles the drag operation. The args are almost the same, but not quite, which is why we need two entrances to the actual method.

But enough talk, let's code. The class that asks for the item to be published looks like this:

using Sitecore;
using Sitecore.Configuration;
using Sitecore.Data;
using Sitecore.Data.Items;
using Sitecore.Publishing;
using Sitecore.Diagnostics;
using Sitecore.Web.UI.Sheer;
using System;
using System.Collections.Generic;
using System.Linq;

namespace MyCode
{
  public class ItemMoved
  {
    // Only do something if it's this particular item type
    // that is moved. Replace the GUID with your own template ID.
    private static readonly ID _TEMPLATE_ID = new ID("{00000000-0000-0000-0000-000000000000}");

    // Entrance for the UiMoveItems pipeline
    public void UiMoveItems(ClientPipelineArgs args)
    {
      DoProcess(args, "target", "items");
    }

    // Entrance for the UiDragItemTo pipeline
    public void UiDragItemTo(ClientPipelineArgs args)
    {
      DoProcess(args, "target", "id");
    }

    // The actual method
    private void DoProcess(ClientPipelineArgs args, string targetParam, string sourceParam)
    {
      Assert.ArgumentNotNull(args, "args");

      // Get the master database from the args
      Database db = Factory.GetDatabase(args.Parameters["database"]);
      Assert.IsNotNull(db, "db");

      // Get the target item we are moving to
      Item targetItem = GetTargetItem(args, db, targetParam);
      Assert.IsNotNull(targetItem, "targetItem");

      // Get the source items being moved. The first item 
      // is the root item of the items moved.
      IEnumerable<Item> sourceItems = GetSourceItems(args, db, sourceParam);
      Assert.IsNotNull(sourceItems, "sourceItems");
      Assert.IsTrue(sourceItems.Count() != 0, "sourceItems are empty");
      Item sourceItem = sourceItems.First();

      if (!args.IsPostBack)
      {
        // No one clicked anything yet. Check if it's the item
        // in question that is being moved
        if (sourceItem.TemplateID == _TEMPLATE_ID)
        {
          // If the item is not published at the moment, ignore the item
          if (!sourceItem.Publishing.IsPublishable(DateTime.Now, false))
            return;
          // The item is published. Ask the user to publish the item for them
          SheerResponse.Confirm($"You have moved {sourceItem.Name}. You need to publish the item immediately. Would you like to publish it now?");
          args.WaitForPostBack();
          return;
        }
        return;
      }

      Context.ClientPage.Modified = false;
      if (args.Result == "yes")
      {
        // The user clicked "yes" to publish the item. Publish the item now.
        PublishOptions publishOptions = new PublishOptions(sourceItem.Database, Database.GetDatabase("web"), PublishMode.SingleItem, sourceItem.Language, DateTime.Now);
        publishOptions.RootItem = sourceItem;
        publishOptions.Deep = true;
        publishOptions.PublishRelatedItems = false;
        publishOptions.CompareRevisions = false;
        var handle = PublishManager.Publish(new PublishOptions[] { publishOptions });
        PublishManager.WaitFor(handle);     
      }
      return;
    }

    // Returns the item we move to
    private Item GetTargetItem(ClientPipelineArgs args, Database db, string paramName)
    {
      var targetId = args.Parameters[paramName];
      Assert.IsNotNullOrEmpty(targetId, "targetId");
      var targetItem = db.GetItem(targetId);
      return targetItem;
    }

    // Returns the items we move
    private IEnumerable<Item> GetSourceItems(ClientPipelineArgs args, Database db, string paramName)
    {
      var sourceIds = args.Parameters[paramName].Split('|').ToList();
      Assert.IsTrue(sourceIds.Any(), "sourceIds.Any()");
      var sourceItems = sourceIds.Select(id => db.GetItem(id)).ToList();
      return sourceItems;
    }
  }
}

The method needs to be hooked up in your pipelines:

<?xml version="1.0" encoding="utf-8"?>
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
    <sitecore>
      <processors>
        <uiMoveItems>
          <processor patch:after="*[@method='RepairLinks']" mode="on" type="MyCode.ItemMoved, MyDll" method="UiMoveItems" />
        </uiMoveItems>
        <uiDragItemTo>
          <processor patch:after="*[@method='RepairLinks']" mode="on" type="MyCode.ItemMoved, MyDll" method="UiDragItemTo" />
        </uiDragItemTo>
      </processors>
    </sitecore>
</configuration>

Did you notice how the ItemMoved class contains two entry methods, UiMoveItems and UiDragItemTo? This is because the parameters are not the same when pressing the "Move to" button as when dragging: the source items are stored in two different parameters ("items" vs. "id").

That’s it. Happy coding.


Run tasks in parallel using .NET Core, C# and async coding

If you have several tasks that can be run in parallel, but still need to wait for all the tasks to end, you can easily achieve this using the Task.WhenAll() method in .NET Core.

Imagine you have this imaginary method that takes a file name and uploads the file to some location:

private async Task UploadFile(string fileName)
{
  // Pseudo code - _fileRepository is an imaginary repository,
  // you just need to imagine that this method executes a task
  if (File.Exists(fileName))
    await _fileRepository.UploadFile(fileName);
}

RUN ONCE:

This method can be called from your main method:

private static async Task Main(string[] args)
{
  await UploadFile("c.\\file.txt");
}

RUN IN SEQUENCE:

If you have 2 files to be uploaded you can call it twice:

private static async Task Main(string[] args)
{
  await UploadFile("c.\\file.txt");
  await UploadFile("c.\\file2.txt");
}

This will upload the first file, then the next file. There is no parallelism here, as "async Task" does not automatically make something run in parallel.

RUN IN PARALLEL:

But with Task.WhenAll() you can run both at the same time in parallel:

private static async Task Main(string[] args)
{
  var task1 = UploadFile("c:\\file.txt");
  var task2 = UploadFile("c:\\file2.txt");
  await Task.WhenAll(task1, task2);
}

This will start both uploads, run them concurrently, and return when both are done.

RUN IN PARALLEL THE FLEXIBLE WAY:

If you want even more flexibility, you can build an IEnumerable of tasks from a list of file names:

private static async Task Main(string[] args)
{
  List<string> fileNames = new List<string>();
  fileNames.Add("c:\\file.txt");
  fileNames.Add("c:\\file2.txt");
  var tasks = fileNames.Select(f => UploadFile(f));
  await Task.WhenAll(tasks);
}

This will create a list of Tasks to be run at the same time. You can add many file names to the fileNames list and have them all run, each as its own task.

RUN IN PARALLEL IN BATCHES:

Beware of the limitations of threading. Spawning a thread has a small but significant overhead, and running too many tasks at once could be slower than running them in sequence. If you have hundreds of files to upload, you should run the tasks in batches:

private static async Task Main(string[] args)
{
  List<string> fileNames = new List<string>();
  fileNames.Add("c:\\file.txt");
  fileNames.Add("c:\\file2.txt");
  // ... adding 100's of files

  var batchSize = 10;
  int batchCount = (int)Math.Ceiling((double)fileNames.Count / batchSize);
  for (int i = 0; i < batchCount; i++)
  {
    var filesToUpload = fileNames.Skip(i * batchSize).Take(batchSize);
    var tasks = filesToUpload.Select(f => UploadFile(f));
    await Task.WhenAll(tasks);
  }
}

This will start 10 uploads at a time and wait for them to finish before taking the next 10.
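Batching has the drawback that each batch waits for its slowest upload before the next batch starts. An alternative (not part of the original examples) is to throttle with a SemaphoreSlim, so a new upload starts as soon as a slot is free. A minimal sketch; the UploadFilesThrottled name is just a placeholder:

private static async Task UploadFilesThrottled(IEnumerable<string> fileNames, int maxParallel)
{
  // Requires using System.Collections.Generic, System.Linq and System.Threading
  using var semaphore = new SemaphoreSlim(maxParallel);
  var tasks = fileNames.Select(async fileName =>
  {
    // Wait for a free slot before starting the next upload
    await semaphore.WaitAsync();
    try
    {
      await UploadFile(fileName);
    }
    finally
    {
      semaphore.Release();
    }
  });
  await Task.WhenAll(tasks);
}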


Using full Lucene Query Syntax in Azure Search

Azure Cognitive Search is the search engine in Microsoft Azure. You can search using simple queries (the default), which are good for full text search, or you can use the full syntax, which is the Lucene query syntax. The full syntax is good for searching for specific values in specific fields.

GOOD TO KNOW: SEARCHABLE VS FILTERABLE

Not all fields can be searched, and not all fields are searched the same way. Your field needs to be "facetable" (great for GUIDs and other IDs that you do exact searches on) or "searchable" (great for text) in order for the field to be searched. If your field is "filterable" (great for booleans and other exact values), you need to specify the search differently for that field.

Search index example: notice how the fields have different search properties

In my examples, I will search using the "AllCategoryIds" and "CustomerName" fields, and filter using the "IsFeed" field.
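For reference, here is a sketch of how those three fields could be defined in the index. The exact definition is an assumption based on the examples below, not a copy of the real index:

{
  "fields": [
    { "name": "CustomerName", "type": "Edm.String", "searchable": true, "filterable": false },
    { "name": "AllCategoryIds", "type": "Collection(Edm.String)", "searchable": true, "facetable": true },
    { "name": "IsFeed", "type": "Edm.Boolean", "filterable": true }
  ]
}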

THE SEARCH SYNTAX:

You can test out your searches using the POST endpoint in Azure Search.

Do a POST to:

https://[yourazure]/indexes/[yourindex]/docs/search?api-version=2020-06-30
Headers:
api-key: The API key of the index
Content-Type: application/json

Content:

{
  "search": "field:value",
  "filter": "field eq true",
  "queryType": "full",
  "searchMode": "all"
}

Replace “field” with the name of the field, and “value” with the name of the value.

Notice how the syntax is different for search and filter? The "search" field uses the full Lucene query syntax, while the "filter" uses the OData filter syntax. Why two different syntaxes? I don't know.

SEARCH EXAMPLE: GIVE ME ALL CUSTOMERS WITH NAME “MICROSOFT” OR “APPLE”:

{
  "search": "CustomerName:Microsoft or CustomerName:Apple",
  "queryType": "full",
  "searchMode": "all"
}

SEARCH EXAMPLE: ALWAYS RETURN “MICROSOFT” AND ALL CUSTOMERS WITH A CERTAIN CATEGORY GUID:

{
  "search": "AllCategoryIds:0baa80ca-a16e-4823-823e-06a11ddd2310 OR CustomerName:Microsoft",
  "queryType": "full",
  "searchMode": "all"
}

SEARCH EXAMPLE: GIVE ME ALL CUSTOMERS WITH NAME “MICROSOFT” WHERE ISFEED IS FALSE:

{
  "search": "CustomerName:Microsoft",
  "filter": "IsFeed eq false",
  "queryType": "full",
  "searchMode": "all"
}

HOW TO SEARCH FROM C#:

This is a small example of how to use the Microsoft Azure Search NuGet package to do a full Lucene query search:

using Microsoft.Azure.Search;
using Microsoft.Azure.Search.Models;

void Search()
{
  // Get a search client
  SearchServiceClient searchServiceClient = new SearchServiceClient("accountname", new SearchCredentials("apikey"));
  // Get an index from the search client
  ISearchIndexClient indexClient = searchServiceClient.Indexes.GetClient("indexname");
  
  // Create the search parameters
  SearchParameters searchParameters = new SearchParameters();
  searchParameters.QueryType = QueryType.Full;
  searchParameters.SearchMode = SearchMode.All;
  searchParameters.IncludeTotalResultCount = true;
  // Optional filter
  searchParameters.Filter = "IsFeed eq false";
  // The actual query
  string queryText = "CustomerName:Microsoft";
  
  // Do the search
  DocumentSearchResult<Document> documentSearchResult = indexClient.Documents.Search(queryText, searchParameters);
  foreach (SearchResult<Document> searchResult in documentSearchResult.Results)
  {
    // Do stuff with the search result
  }
}


Sitecore field level Security – give write access to members of a certain group

The Sitecore security model is pretty straight forward, but as everything security, it can become complicated.

This goes for field level security. For a certain field, I wish to grant read access to everyone, but write access only to members of my “Price Administrator” role.

STEP 1: THE SETUP

First, create the new role:

Add Sitecore Role

Select the field that needs to have its access modified, and select "Assign security":

Field to grant access

For the “sitecore\everyone” role, grant “field read” access, but deny inheritance. It is important that you deny inheritance rather than explicitly denying write access: an explicit deny overrides any grant, so no other role could then be given write access to the field, and everyone but administrators would be denied access:

Everyone has read access, but denied inheritance

For the “sitecore\Price Administrator“, grant “field write” access:

Price Administrator has field write access

STEP 2: THE TEST

Go to a page that uses the field. Ordinary users (non-admins) will see the field, but it is read-only:

Field is read-only

Then grant the role to your Sitecore user:

Price Administrator Role is added to the user

… and the user has write access:

User has write access to field


Method not found: ‘Void Sitecore.ContentSearch.Diagnostics.AbstractLog.SingleWarn(System.String, System.Exception)’.

I struggled with this error in my development environment:

Method not found: ‘Void Sitecore.ContentSearch.Diagnostics.AbstractLog.SingleWarn(System.String, System.Exception)’.

at System.RuntimeMethodHandle.InvokeMethod(Object target, Object[] arguments, Signature sig, Boolean constructor)
at System.Reflection.RuntimeMethodInfo.UnsafeInvokeInternal(Object obj, Object[] parameters, Object[] arguments)
at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)
at Sitecore.ContentSearch.SolrProvider.LinqToSolrIndex`1.Execute[TResult](SolrCompositeQuery compositeQuery)
at Sitecore.ContentSearch.Linq.QueryableExtensions.GetResults[TSource](IQueryable`1 source)

After an hour or so of debugging, and not understanding why the error only occurred in my development environment and not in production, I did a full rebuild, and lo and behold: the error disappeared.

Well, it turns out that I had the wrong version of Sitecore.ContentSearch.dll in my development environment. Some NuGet reference had overwritten the correct hotfix DLL (3.1.1-r00161 Hotfix 206976-1) with an older version (3.1.1-r00161), and that version apparently does not contain the AbstractLog.SingleWarn(System.String, System.Exception) method.

Moral: if the error message states that your method is missing, it could be true. Check your dependencies before panicking.



C# Set local folder for .NET Core Windows Services

When developing .NET Core Worker Services, you can allow the service to run as a Windows Service:

public static IHostBuilder CreateHostBuilder(string[] args)
{
  var host = Host.CreateDefaultBuilder(args);
  host.UseWindowsService();
  ...
  ...

The side effect is that the root folder changes from the local folder to the System32 folder, which means that any log files you would expect to find in your local folder suddenly end up in another folder.

The fix is easy: simply add the following to the Main method of your application:

public static void Main(string[] args)
{
  Directory.SetCurrentDirectory(AppDomain.CurrentDomain.BaseDirectory);
  CreateHostBuilder(args).Build().Run();
}

SetCurrentDirectory will then rebase the local folder to the base directory of your application, and your log files will be written to the local folder.


Manipulating XML Google Merchant Data using C# and LINQ

A Google Merchant Data feed (also known as a Google Product Feed) can be manipulated fairly easily at import time using a little C# and LINQ.

The feed is basically an XML RSS 2.0 feed with some added properties in the xmlns:g="http://base.google.com/ns/1.0" namespace.

These feeds often come from older systems, and the data is created by busy merchants, so it can be relatively dirty; a cleanup is required before you add the products to your own product database.

The feed could look like this:

<?xml version="1.0" encoding="utf-8" ?>
<rss version="2.0" xmlns:g="http://base.google.com/ns/1.0">
    <channel>
        <title>Google product feed</title>
        <link href="https://pentia.dk" rel="alternate" type="text/html"/>
        <description>Google product feed</description>
        <item>
            <g:id><![CDATA[1123432]]></g:id>
            <title><![CDATA[Some product]]></g:title>
            <link><![CDATA[https://pentia.dk]]></g:link>
            <g:description><![CDATA[description]]></g:description>
            <g:gtin><![CDATA[5712750043243446]]></g:gtin>
            <g:mpn><![CDATA[34432-00]]></g:mpn>
            <g:image_link><![CDATA[https://pentia.dk/someimage.jpg]]></g:image_link>
            <g:product_type><![CDATA[Home &gt; Dresses &gt; Maxi Dresses]]></g:product_type>
            <g:condition><![CDATA[new]]></g:condition>
            <g:availability><![CDATA[in stock]]></g:availability>
            <g:price><![CDATA[15.00 USD]]></g:price>
            <g:sale_price><![CDATA[10.00 USD]]></g:sale_price>
        </item>
        ...
        ...
    </channel>
</rss>

See the full specification in the Google Merchant Center help.

Sometimes the feed contains content that you do not need, and a little XML manipulation is required.

But first things first:

STEP 1: GET THE XML FEED AND CONVERT IT INTO AN XML DOCUMENT

using System;
using System.Net.Http;
using System.Threading.Tasks;
using System.Xml.Linq;
using System.Linq;

private static HttpClient _httpClient = new HttpClient();

public static async Task<string> GetFeed(string url)
{
  using (var result = await _httpClient.GetAsync($"{url}"))
  {
    string content = await result.Content.ReadAsStringAsync();
    return content;
  }
}

public static void Run()
{
  // Get the RSS 2.0 XML data
  string feedData = GetFeed("https://url/thefeed.xml").Result;

  // Convert the data into an XDocument
  var document = XDocument.Parse(feedData);
  // Specify the Google namespace
  XNamespace g = "http://base.google.com/ns/1.0";
  // Get a list of all "item" nodes
  var items = document.Descendants().Where(node => node.Name == "item");
    
  // Now we are ready to manipulate
  // ...
  // ...
}

NOW TO THE MANIPULATIONS:

EXAMPLE 1: Remove duplicates – all products with the same ID are removed:

items.GroupBy(node => node.Element(g+"id").Value)
  .SelectMany(node => node.Skip(1))
  .Remove();

EXAMPLE 2: Remove all products out of stock:

items = document.Descendants()
  .Where(node => node.Name == "item" 
         && node.Descendants()
         .Any(desc => desc.Name == g + "availability" 
              && desc.Value == "out of stock"));
items.Remove();

EXAMPLE 3: Remove adverts not on sale (all adverts where the g:sale_price node is missing or empty):

items = document.Descendants()
  .Where(node => node.Name == "item" 
         && !node.Descendants()
         .Any(desc => desc.Name == g + "sale_price" 
              && desc.Value.Trim() != string.Empty));
items.Remove();

EXAMPLE 4: ADD TRACKING PARAMETERS TO URLS (adding query string parameters to the URL):

var items = document.Descendants().Where(node => node.Name == "item");
foreach (var item in items)
{
  string url = item.Element("link").Value;
  if (url.Contains("?"))
    item.Element("link").ReplaceNodes(new XCData(url + "&" + "utm_source=s&utm_medium=m&utm_campaign=c"));
  else  
    item.Element("link").ReplaceNodes(new XCData(url + "?" + "utm_source=s&utm_medium=m&utm_campaign=c"));
}

EXAMPLE 5: CHANGE THE TITLE (for example, if the feed contains used products, you might want to add the word "USED" to the title):

var items = document.Descendants().Where(node => node.Name == "item");
foreach (var item in items)
{
  var title = "USED " + item.Element("title").Value;
  item.Element("title").ReplaceNodes(title);
}

…AND THE EXOTIC EXAMPLE: COMBINE ALL PRODUCTS THAT BELONG TO A PRODUCT_TYPE CONTAINING 2 OR FEWER PRODUCTS:

foreach(var group in items.GroupBy(node => node.Element(g+"product_type").Value))
{
  if (group.Count() <= 2)
  {
    foreach (var advert in group)
    {
      advert.Element(g+"product_type").ReplaceNodes(new XCData("Other"));
    }
  }
}

Finally you can grab the manipulated document and do what you need to do:

// Grab converted content
string convertedFeedData = document.ToString();
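Note that XDocument.ToString() does not include the XML declaration. If the receiving system expects one, save the document instead, or prepend the declaration yourself (the file name below is just an example):

// Writes the document including the <?xml ... ?> declaration
document.Save("cleaned-feed.xml");

// Or keep it as a string with the declaration included
string feedWithDeclaration = document.Declaration + Environment.NewLine + document.ToString();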

I hope this gives some examples of how to do much with little code.


C# Azure TelemetryClient will leak memory if not implemented as a singleton

I noticed that my classic .net web application would leak memory after I implemented metrics for some background tasks.

Memory usage of web application

Further investigation showed that my MetricAggregationManager would not release its memory.

Object was not garbage collected

Since one of the major changes was the implementation of a TelemetryClient, and since the memory not being released was from the Microsoft.ApplicationInsights.Metrics namespace, I concluded that the problem lies within the creation of the TelemetryClient:

using System;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;

namespace MyCode
{
  public class BaseProcessor
  {
    private readonly TelemetryClient _telemetryClient;
    
    private BaseProcessor()
    {
      string instrumentationKey = "somekey";
      var telemetryConfiguration = new TelemetryConfiguration { InstrumentationKey = instrumentationKey };
      // This is a no-go. I should not create a new instance for every BaseProcessor
      _telemetryClient = new TelemetryClient(telemetryConfiguration);
    }
  }
}

The code above will create a new TelemetryClient for every instance of my base class. The TelemetryClient collects metrics and stores them in memory until either a set time has passed or a set number of metrics has been reached, and then flushes the metrics to Application Insights.

So when the BaseProcessor is disposed, the TelemetryClient is not, leaving the memory hanging, and thus a memory leak is in effect.

HOW TO SOLVE IT?

The solution is simple. All you need to do is use a singleton pattern for your TelemetryClient. Having only one instance allows the client to collect and send metrics in peace. Your code will also be faster (it takes a millisecond or so to create a TelemetryClient), and you will not have any memory leaks.

USE DEPENDENCY INJECTION:

In .NET Core you can add the TelemetryClient to the service collection:

private static void ConfigureServices(IServiceCollection services)
{
  // Add Application Insights
  var telemetryConfiguration = TelemetryConfiguration.CreateDefault();
  telemetryConfiguration.InstrumentationKey = "somekey";
  var telemetryClient = new TelemetryClient(telemetryConfiguration);
  services.AddSingleton(telemetryClient);
}

And then reference it using constructor injection:

using System;
using System.Runtime.Serialization;
using Microsoft.ApplicationInsights;
using Microsoft.AspNetCore.Mvc;

namespace MyCode
{
  [ApiController]
  [Route("/api/[controller]")]
  [Produces("application/json")]
  public class MyController : ControllerBase
  {
    private readonly TelemetryClient _telemetryClient;

    public MyController(TelemetryClient telemetryClient)
    {
      _telemetryClient = telemetryClient;
    }
  }
}
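If you are in an ASP.NET Core application and reference the Microsoft.ApplicationInsights.AspNetCore package, you can also let the SDK handle the registration; as far as I know, AddApplicationInsightsTelemetry() registers the TelemetryClient as a singleton for you:

private static void ConfigureServices(IServiceCollection services)
{
  // Assumes the Microsoft.ApplicationInsights.AspNetCore package is referenced
  services.AddApplicationInsightsTelemetry();
}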

USE A STATIC VARIABLE:

If you do not have access to a DI framework, you could also just create a static variable:

using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;
using System.Collections.Generic;

namespace MyCode
{
  public static class TelemetryFactory
  {
    private static TelemetryClient _telemetryClient;

    public static TelemetryClient GetTelemetryClient()
    {
      if (_telemetryClient == null)
      {
        string instrumentationKey = "somekey";
        var telemetryConfiguration = new TelemetryConfiguration { InstrumentationKey = instrumentationKey };
        _telemetryClient = new TelemetryClient(telemetryConfiguration);
      }

      return _telemetryClient;
    }
  }
}

And then get the client from the factory instead:

using System;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Metrics;

namespace MyCode
{
  public class BaseProcessor
  {
    private readonly TelemetryClient _telemetryClient;
    
    private BaseProcessor()
    {
      _telemetryClient = TelemetryFactory.GetTelemetryClient();
    }
  }
}
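One caveat about the TelemetryFactory above: the null check is not thread-safe, so two threads calling GetTelemetryClient() at the same time could each create a client. Wrapping the creation in a Lazy<T> avoids that; this is a sketch, not the original code:

using System;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;

namespace MyCode
{
  public static class TelemetryFactory
  {
    // Lazy<T> guarantees the factory delegate only runs once, even under concurrent access
    private static readonly Lazy<TelemetryClient> _telemetryClient = new Lazy<TelemetryClient>(() =>
    {
      var telemetryConfiguration = new TelemetryConfiguration { InstrumentationKey = "somekey" };
      return new TelemetryClient(telemetryConfiguration);
    });

    public static TelemetryClient GetTelemetryClient() => _telemetryClient.Value;
  }
}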
