
ML.NET 1.0 RC Announced. What does it mean?

Microsoft has announced the new ML.NET 1.0 RC (Release Candidate), version 1.0.0-preview. This is going to be the last preview release before the final ML.NET 1.0 ships in Q2 2019, which means we will soon be able to do machine learning in C# with a stable version of the library.
It seems that about 95% of the functionality in the RC is going to be part of the stable 1.0 version. Nevertheless, some packages will remain in preview even after the release.
These packages are:
  • TensorFlow components
  • Onnx components
  • TimeSeries components
  • Recommendations components
The full list of preview packages is available here.

In this release (ML.NET 1.0 RC), Microsoft has mostly concluded the main API changes. For the next sprint, the team plans to focus on documentation and sample improvements, and on addressing major critical issues if needed.

I want to believe that their goal is to avoid any new breaking changes moving forward. That is good news for everyone who wants to start using machine learning without switching away from their favorite language. Based on the list of changes, now is a good time to start learning the library and preparing integrations.
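To give a feel for what that looks like, here is a minimal, hedged sketch of a regression pipeline using the 1.0-style API from the Microsoft.ML package. The HouseData/PricePrediction classes, column names, and toy data are made up for illustration, and trainer names may differ slightly in the RC preview.

```csharp
using System;
using Microsoft.ML;
using Microsoft.ML.Data;

// Hypothetical input and output schemas for a toy regression problem.
public class HouseData
{
    public float Size { get; set; }
    public float Price { get; set; }
}

public class PricePrediction
{
    [ColumnName("Score")]
    public float Price { get; set; }
}

public static class Program
{
    public static void Main()
    {
        var mlContext = new MLContext(seed: 0);

        // Made-up training data, just enough to fit a model.
        var houses = new[]
        {
            new HouseData { Size = 1.1F, Price = 1.2F },
            new HouseData { Size = 1.9F, Price = 2.3F },
            new HouseData { Size = 2.8F, Price = 3.0F },
            new HouseData { Size = 3.4F, Price = 3.7F }
        };
        IDataView trainingData = mlContext.Data.LoadFromEnumerable(houses);

        // Pipeline: combine input columns into "Features", then train a regression model.
        var pipeline = mlContext.Transforms.Concatenate("Features", "Size")
            .Append(mlContext.Regression.Trainers.Sdca(
                labelColumnName: "Price", maximumNumberOfIterations: 100));

        ITransformer model = pipeline.Fit(trainingData);

        // Score a single new example.
        var engine = mlContext.Model.CreatePredictionEngine<HouseData, PricePrediction>(model);
        var prediction = engine.Predict(new HouseData { Size = 2.5F });
        Console.WriteLine($"Predicted price: {prediction.Price}");
    }
}
```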

The RC release also fixes the problems with using TensorFlow models that were introduced in version 0.11. You can check out the additional release notes for 1.0 RC here.
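Since TensorFlow support is one of the components staying in preview, here is a rough, hedged sketch of how an existing frozen TensorFlow model can be plugged into an ML.NET pipeline, assuming the Microsoft.ML.TensorFlow preview package. The model path, the graph node names ("input", "softmax"), and the 784-element input vector are placeholders for whatever your model actually expects; exact method names may still change while the package is in preview.

```csharp
using Microsoft.ML;
using Microsoft.ML.Data;

// Hypothetical classes matching the TensorFlow graph's input and output nodes.
public class TfInput
{
    [VectorType(784)]
    [ColumnName("input")]
    public float[] Features { get; set; }
}

public class TfOutput
{
    [ColumnName("softmax")]
    public float[] Scores { get; set; }
}

public static class TensorFlowScoring
{
    public static void Main()
    {
        var mlContext = new MLContext();

        // Load a frozen TensorFlow model (placeholder path) and score its "input" -> "softmax" nodes.
        var pipeline = mlContext.Model
            .LoadTensorFlowModel("frozen_model.pb")
            .ScoreTensorFlowModel(
                outputColumnNames: new[] { "softmax" },
                inputColumnNames: new[] { "input" },
                addBatchDimensionInput: false);

        // The TensorFlow scorer doesn't train, so fitting on an empty data view
        // just materializes the transformer with the expected schema.
        var emptyData = mlContext.Data.LoadFromEnumerable(new TfInput[0]);
        var model = pipeline.Fit(emptyData);

        // Score a single (placeholder) example.
        var engine = mlContext.Model.CreatePredictionEngine<TfInput, TfOutput>(model);
        var scores = engine.Predict(new TfInput { Features = new float[784] }).Scores;
    }
}
```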
You can find a list of other breaking changes here.

Get ready for ML.NET 1.0 before it releases! The following resources will be useful for digging deeper:

Comments

  1. Great Blog! HASHCRON Technologies Software developers typically do the following: Analyze users' needs and then design and develop software to meet those needs. Recommend software upgrades for customers' existing programs and systems. Design each piece of an application or system and plan how the pieces will work together.

    ReplyDelete

Post a Comment
