Some Great Innovations in .NET 7

As we know, .NET Conf 2022 took place on November 8-10. During the conference, exciting innovations and performance improvements around .NET 7 and C# 11 were discussed. In this release, the focus was on great topics such as being able to develop cloud-native applications faster, lighter, and more easily.

In this article, I will try to mention some innovations that I like.

NOTE: First of all, if you don’t have the .NET 7 release yet, you can download it here.

As in every .NET release, this release was said to bring performance improvements at a level that could be called major, perhaps even the best yet. Really good job on performance. A significant part of these improvements were made on the JIT side, and I would like to briefly touch on this.

As we know, the JIT is responsible for translating MSIL code into native code at runtime. It performs many different optimizations in the background, taking the environment and process into account, so that our applications run efficiently. As we might expect, such just-in-time performance optimizations take time and come with some tradeoffs. For example, when the JIT skips optimizations or applies them only partially, the application’s start-up time may improve, but its throughput may decrease. To reduce such tradeoffs, the JIT has used Tiered Compilation by default since .NET Core 3. Thus, to achieve better performance optimization, the JIT can hot-swap code at runtime by recompiling the relevant methods more than once according to usage statistics, instead of compiling each method only once.
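Tiered compilation can be influenced via MSBuild properties in the project file. For example, it can be turned off entirely so that every method is fully optimized up front (usually not recommended, shown here only to illustrate the tradeoff):

```xml
<PropertyGroup>
  <!-- Disabling tiered compilation trades slower start-up for
       fully optimized code from the first invocation. -->
  <TieredCompilation>false</TieredCompilation>
</PropertyGroup>
```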

With .NET 7, the JIT goes a step further in eliminating these tradeoffs by making use of the On-Stack Replacement (OSR) technique. Thus, the JIT can perform optimizations not only between method invocations, but even while the relevant method is running.

Apart from these, great performance improvements have been made in many other important areas such as Threading, Networking, Collections, and LINQ. In short, just by moving to .NET 7, we will get a nice performance gain by default.

Native AOT for Console Applications

First of all, I would like to start with Native Ahead-of-Time (AOT) compilation, which excites me. As we know, the .NET team has been working on Native AOT for a while, and with .NET 7 they announced that this work would move out of experimental status and into mainline development. With this release, Native AOT is now officially with us for console applications and class libraries.

Native AOT, in short, generates native code at compile time instead of at run time. When publishing the application, it compiles the IL code into native code for the specified runtime. Thus, Native AOT applications do not need the JIT while running; in other words, we can run them in environments that do not have the .NET runtime installed. Of course, similar capabilities have been offered before, for example “ReadyToRun”, but Native AOT takes this concept to a better point.

The benefits, briefly:

  • We can say that it removes the need for the JIT. (Of course, this is a bit controversial when it comes to runtime performance rather than start-up time. As we know, the JIT compiler analyzes the environment it runs in and provides the best optimizations for our code.)
  • It speeds up the application’s start-up time considerably.
  • It provides lower memory usage.
  • It considerably reduces the on-disk size of the application compared to “self-contained” publishing.
  • It also enables the development of native libraries that can be consumed by different programming languages.

While Native AOT is exciting, it unfortunately has some limitations.

  • If you need runtime code generation (System.Reflection.Emit), it unfortunately cannot be used with Native AOT.
  • Dynamic assembly loading (Assembly.LoadFile) is not supported.
  • For now, we can use it only for console applications and class libraries.

Although it has limitations for now, I am sure it will reach a good point in the future.

Let’s create a console application with the target framework .NET 7 to perform a quick test. Then, let’s put a simple piece of code that checks whether a given input is a palindrome into the Program.cs file.
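Such a check could look like the following minimal sketch (the IsPalindrome helper and its normalization are my own choices):

```csharp
// Program.cs - .NET 7 top-level statements.
// Checks whether the given input reads the same forwards and backwards.
static bool IsPalindrome(string input)
{
    // Normalize casing so that "Kayak" also counts as a palindrome.
    var text = input.ToLowerInvariant();
    int left = 0, right = text.Length - 1;
    while (left < right)
    {
        if (text[left] != text[right])
            return false;
        left++;
        right--;
    }
    return true;
}

Console.Write("Input: ");
var input = Console.ReadLine() ?? string.Empty;
Console.WriteLine(IsPalindrome(input)
    ? $"\"{input}\" is a palindrome."
    : $"\"{input}\" is not a palindrome.");
```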

Now, in order to compile this application natively, we need to add a property like below to the application’s project file.
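The property in question is “PublishAot”; a project file with it enabled might look like this:

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net7.0</TargetFramework>
    <!-- Compiles the application to native code during publish -->
    <PublishAot>true</PublishAot>
  </PropertyGroup>
</Project>
```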

If we don’t want to add a property, we can pass it as a parameter in the publish command.

Then we will be able to compile the application natively by specifying a runtime identifier of our choice. For example, we can use the “win-x64” identifier for Windows and the “linux-arm64” identifier for Linux on ARM64.
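The publish commands would look roughly like this (runtime identifiers chosen as examples):

```shell
# Publish natively for Windows x64 (PublishAot set in the project file)
dotnet publish -c Release -r win-x64

# Or, without touching the project file, pass the property on the command line
dotnet publish -c Release -r linux-arm64 /p:PublishAot=true
```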

NOTE: If we compile the application on Ubuntu 20.04, the resulting binary only works on that version or higher. In short, we need to pay attention to the Linux version we use to compile.


IMemoryCache Metrics Support

As we know, the easiest method we can use for in-memory caching in ASP.NET Core is to use IMemoryCache.

With .NET 7, a new API for metrics support has been added to IMemoryCache, which has been with us for a long time. Now, with MemoryCacheStatistics, we can access the estimated size of the cache as well as information on how the cache is being used.

Accessing the metric information about the application’s in-memory cache and taking actions accordingly will be beneficial for the health of the application.

In order to access this metric information, we need to call the “GetCurrentStatistics()” method on IMemoryCache. In addition, to track these metrics over time, we can either access them via the dotnet-counters tool using the EventCounters API, or use the .NET metrics API together with OpenTelemetry.

Before doing this, when adding IMemoryCache to the service collection, we need to set the “TrackStatistics” option to “true”.
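Putting these pieces together, a minimal sketch (assuming the Microsoft.Extensions.Caching.Memory and Microsoft.Extensions.DependencyInjection packages) might look like this:

```csharp
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.DependencyInjection;

var services = new ServiceCollection();

// Statistics are not collected unless TrackStatistics is enabled.
services.AddMemoryCache(options => options.TrackStatistics = true);

var provider = services.BuildServiceProvider();
var cache = provider.GetRequiredService<IMemoryCache>();

cache.Set("key", "value");
cache.TryGetValue("key", out string? _);     // counts as a hit
cache.TryGetValue("missing", out string? _); // counts as a miss

// GetCurrentStatistics() returns null when statistics tracking is off.
MemoryCacheStatistics? stats = cache.GetCurrentStatistics();
Console.WriteLine($"Entries: {stats?.CurrentEntryCount}, " +
                  $"Hits: {stats?.TotalHits}, Misses: {stats?.TotalMisses}");
```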

Central NuGet Package Management

Although it’s not a huge feature, I liked being able to manage versions of common NuGet packages used by multiple projects from a central location.

For this, we need to create a file named Directory.Packages.props in the root folder of the relevant solution and define the packages we want in it as follows.
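A Directory.Packages.props file looks roughly like this (the package names and versions below are just examples):

```xml
<Project>
  <PropertyGroup>
    <!-- Opts the solution in to central package management -->
    <ManagePackageVersionsCentrally>true</ManagePackageVersionsCentrally>
  </PropertyGroup>
  <ItemGroup>
    <!-- Versions are defined once here, for all projects -->
    <PackageVersion Include="Newtonsoft.Json" Version="13.0.2" />
    <PackageVersion Include="Serilog" Version="2.12.0" />
  </ItemGroup>
</Project>
```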

Then, it will be sufficient to add the name of the relevant package as a reference in the project we want.
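In the individual project file, the reference is declared without a version (again, the package name is just an example):

```xml
<ItemGroup>
  <!-- No Version attribute: the version comes from Directory.Packages.props -->
  <PackageReference Include="Newtonsoft.Json" />
</ItemGroup>
```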

Required Members

On the C# 11 side, with the “required” keyword, forcing members of a type to be initialized has become very easy.
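A small sketch (the class name MyClass is my own; the member name MyRequiredParam is taken from the text below):

```csharp
// OK: the required member is set in the object initializer.
var ok = new MyClass { MyRequiredParam = "Hello" };
Console.WriteLine(ok.MyRequiredParam);

// Compile-time error CS9035: required member 'MyRequiredParam' must be set.
// var error = new MyClass();

public class MyClass
{
    // 'required' forces callers to set this member during initialization.
    public required string MyRequiredParam { get; set; }
}
```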

As we saw above, when we use the “required” keyword, it becomes mandatory to set the “MyRequiredParam” member while initializing the related class.

Microsoft Orleans 7.0

I have always had a special interest in the actor model, and I follow the Orleans project closely in this regard. I have written a few articles and given a presentation about Orleans before.

As part of .NET 7, great performance improvements have also been made on the Orleans side. Improvements have been made around immutability, and a new serializer has been introduced. When I was working with Orleans a few years ago, it was already a great and performant tool; I have developed several different applications on it. I’m curious about its current state and performance.

EF7 JSON Columns & Bulk Operations

As we know, SQL Server has supported JSON columns for a long time. With this release, JSON column support has now arrived on the EF side as well. Now, with LINQ, we will be able to perform queries and other operations over JSON aggregates on the SQL Server side.

For example, suppose we have a schema like the one below.
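For illustration, the schema might be modeled like this (the Product entity and the properties inside PriceDetails are my own assumptions; PriceDetails itself is the aggregate named in the text):

```csharp
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; } = null!;

    // The aggregate we want to store as a JSON column.
    public PriceDetails PriceDetails { get; set; } = null!;
}

public class PriceDetails
{
    public decimal ListPrice { get; set; }
    public decimal DiscountedPrice { get; set; }
    public string Currency { get; set; } = null!;
}
```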

At this point, we want to keep the PriceDetails information as a JSON column. All we need to do for this is to call the “ToJson()” method during model configuration as follows.
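A sketch of that model configuration, assuming an EF Core 7 DbContext (the CatalogContext name is my own; Product and PriceDetails come from the schema above):

```csharp
using Microsoft.EntityFrameworkCore;

public class CatalogContext : DbContext
{
    public DbSet<Product> Products => Set<Product>();

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.Entity<Product>().OwnsOne(
            product => product.PriceDetails,
            builder => builder.ToJson()); // map the owned aggregate to a JSON column
    }
}
```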

The rest is up to our LINQ power.

On the bulk operations side, two new methods have been introduced: “ExecuteUpdateAsync” and “ExecuteDeleteAsync”. Using these methods, we will be able to perform bulk updates and deletes with whatever LINQ predicates we want.
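A quick sketch of both methods (assuming a DbContext instance named context with a Products DbSet exposing Name and Price properties):

```csharp
// Bulk delete: removes all matching rows with a single SQL statement,
// without loading any entities into memory.
await context.Products
    .Where(p => p.Price < 10)
    .ExecuteDeleteAsync();

// Bulk update: rewrites a property on all matching rows, again in one statement.
await context.Products
    .Where(p => p.Name == "Old Name")
    .ExecuteUpdateAsync(setters =>
        setters.SetProperty(p => p.Name, "New Name"));
```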

It is very nice that such bulk operations are now built into EF, instead of requiring different third-party EF extensions.

Rate-Limiting Middleware

Finally, I would like to talk about the “Rate-Limiting” middleware brought by ASP.NET Core.

As we know, rate limiting is actually an important concern for the APIs we develop, especially public ones. It ensures that the API is not overwhelmed and its performance does not degrade, and it gives us a kind of security mechanism against attacks such as DoS.

It is a very simple middleware to use and comes with 4 different rate-limiting policies: “Fixed window”, “Sliding window”, “Token bucket” and “Concurrency”. We can also attach it at the endpoint level we want.
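Registering a fixed-window policy might look like this sketch (the policy name “fixed” is my own choice; the limits match the example below):

```csharp
using System.Threading.RateLimiting;
using Microsoft.AspNetCore.RateLimiting;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddRateLimiter(options =>
{
    // "fixed" is the name we will refer to at the endpoint level.
    options.AddFixedWindowLimiter("fixed", limiterOptions =>
    {
        limiterOptions.PermitLimit = 100;                 // max 100 requests...
        limiterOptions.Window = TimeSpan.FromSeconds(10); // ...per 10-second window
    });
});
```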

For example, a “Fixed window” policy like the one above allows a maximum of “100” requests within each “10”-second window.

Then we can enable or disable it at the level we want.
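For example, assuming a WebApplication with a policy named “fixed” registered (the endpoint routes here are hypothetical):

```csharp
var app = builder.Build();

// Activates the rate-limiting middleware for the pipeline.
app.UseRateLimiter();

// Apply the "fixed" policy only to this endpoint...
app.MapGet("/products", () => "OK").RequireRateLimiting("fixed");

// ...and explicitly exempt another one.
app.MapGet("/health", () => "OK").DisableRateLimiting();

app.Run();
```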

Let’s Wrap Up

Within the scope of this article, I tried to cover some innovations that I liked at first glance. This release did not disappoint either; every release brings some really cool innovations and performance improvements. In short, a great job!

The improvements, especially on the CLR side, are really great work. I am also very curious about where Native AOT will go. Of course, in addition to these, many different new features that I did not mention here were added and improvements were made, such as loop and reflection performance optimizations and the newly added Tar archive API.
