Source: neptune.ai

This article was originally written by Gopal Singh and posted on the Neptune blog.

“Artificial intelligence will reach human levels by around 2029. Follow that out further to, say, 2045, and we will have multiplied the intelligence — the human biological machine intelligence of our civilization — a billion-fold.”

– Ray Kurzweil, American inventor and futurist.

As we know, data is the new power, and companies around the globe are trying to leverage it in their businesses. Whether that business is:

  • Healthcare: As biomedical data grows, AI can provide a wide variety of services to aid humans. …


This article was originally written by Jakub Czakon and posted on the Neptune blog.

Training machine learning or deep learning models can take a really long time.

If you are like me, you like to know what is happening during that time. You probably want to:

  • monitor your training and validation losses,
  • take a look at GPU consumption,
  • see image predictions after every other epoch,
  • and a bunch of other things.

Neptune lets you do all that, and in this post I will show you how to make it happen. Step by step.

Check out this example run monitoring experiment to see what this can look like. …
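Before diving into the step-by-step setup, here is a generic sketch of what per-epoch monitoring boils down to. This is illustrative only and does not show the Neptune client API itself; the loss values are stand-ins for a real training loop.

```python
# A generic sketch of per-epoch monitoring (illustrative only;
# the Neptune client API itself is not shown here).

def train_one_epoch(epoch):
    # Stand-ins for real training and validation losses.
    return 1.0 / (epoch + 1), 1.2 / (epoch + 1)

history = {"train_loss": [], "val_loss": []}

for epoch in range(5):
    train_loss, val_loss = train_one_epoch(epoch)
    # In a real run you would also send these values to your
    # experiment tracker, and log GPU usage and sample predictions.
    history["train_loss"].append(train_loss)
    history["val_loss"].append(val_loss)
```

With a tracker in place, the only change is sending each value to the tracking backend instead of (or in addition to) keeping it in a local dict.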



This article was originally written by Shahul Es and posted on the Neptune blog.

Choosing the correct hyperparameters for machine learning or deep learning models is one of the best ways to squeeze the last bit of performance out of your models. In this article, I will show you some of the best ways to do hyperparameter tuning that are available today (in 2020).

What is the difference between a parameter and a hyperparameter?

First, let’s understand the differences between a hyperparameter and a parameter in machine learning.

  • Model parameters: These are the parameters that are estimated from the given data during training. …
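The distinction can be made concrete with a tiny example. Below is a hypothetical one-feature ridge regression: the regularization strength `lam` is a hyperparameter we choose before training, while the weight `w` is a parameter estimated from the data.

```python
# Hypothetical one-feature ridge regression (no intercept) to
# illustrate the split: lam is a hyperparameter we choose,
# w is a parameter learned from the data.

def fit_ridge(xs, ys, lam):
    # Closed-form ridge solution for y ≈ w * x:
    # w = sum(x*y) / (sum(x^2) + lam)
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

xs = [1.0, 2.0, 3.0]
ys = [2.0, 4.0, 6.0]       # exact relation: y = 2x

w_no_reg = fit_ridge(xs, ys, lam=0.0)    # parameter estimated from data
w_strong = fit_ridge(xs, ys, lam=100.0)  # a larger hyperparameter shrinks w
```

Changing `lam` changes which parameter value the model ends up with, which is exactly why hyperparameter tuning matters.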



This article was originally written by Raúl Gómez Bruballa and posted on the Neptune blog.

Image retrieval is the task of finding images related to a given query. Content-based image retrieval refers to finding images with certain attributes that are not stored in the image metadata but are present in the image's visual content.

In this post we:

  • explain the theoretical concepts behind content-based image retrieval,
  • show step by step how to build a content-based image retrieval system with PyTorch, addressing a specific application: finding face images with a set of given face attributes (e.g. male, blond, smiling).

Concepts explained that might be of interest:

Ranking Loss, Contrastive Loss, Siamese Nets, Triplet Nets, Triplet Loss, Image…
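At retrieval time, the core idea is simple: a trained network maps every image (and the query) to an embedding vector, and database images are ranked by similarity to the query embedding. The sketch below uses made-up 3-d embeddings and cosine similarity to show only the ranking step, not the PyTorch training described in the article.

```python
# A minimal sketch of retrieval in an embedding space: images and
# the query are represented by (hypothetical) embedding vectors,
# and database images are ranked by cosine similarity to the query.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy 3-d embeddings standing in for a trained network's output.
database = {
    "img_blond_smiling": [0.9, 0.8, 0.1],
    "img_male_serious":  [0.1, 0.2, 0.9],
    "img_blond_serious": [0.8, 0.1, 0.3],
}
query = [1.0, 0.9, 0.0]  # embedding encoding the desired attributes

ranking = sorted(database, key=lambda k: cosine(database[k], query),
                 reverse=True)
```

Losses like triplet or contrastive loss exist precisely to shape this embedding space so that similar images land close together.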



This article was originally written by Derrick Mwiti and posted on the Neptune blog.

Imagine if you could get all the tips and tricks you need to tackle a binary classification problem on Kaggle or anywhere else. I have gone over 10 Kaggle competitions including:

– and pulled out that information for you. …



This article was originally written by Kamil Kaczmarek and posted on the Neptune blog.

Last week I had the pleasure of participating in the International Conference on Learning Representations (ICLR), an event dedicated to research on all aspects of deep learning. Initially, the conference was supposed to take place in Addis Ababa, Ethiopia; however, due to the novel coronavirus pandemic, it went virtual. I'm sure it was a challenge for the organisers to move the event online, but I think the result was more than satisfactory, as you can read here!

Over 1300 speakers and 5600 attendees proved that the virtual format was more accessible to the public, while the conference remained interactive and engaging. From the many interesting presentations, I chose 16 that are influential and thought-provoking. Here are the best deep learning papers from ICLR. …



This article was originally written by Jakub Czakon and posted on the Neptune blog.

Training machine learning/deep learning models can take a really long time, and understanding what is happening as your model is training is absolutely crucial.

Typically you can monitor:

  • Metrics and losses
  • Hardware resource consumption
  • Errors, warnings, and other logs (stderr and stdout)

Depending on the library or framework, this can be easier or more difficult, but it is almost always doable.

Most libraries allow you to monitor your model training in one of the following ways:

  • You can add a monitoring function at the end of the training…
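Many frameworks expose this hook through the callback pattern: the training loop calls user-supplied methods at well-defined points. The sketch below is generic; the class and method names are illustrative, not any specific library's API.

```python
# A generic sketch of the callback pattern used for monitoring:
# the training loop invokes user-supplied hooks at defined points.
# Names here are illustrative, not any specific library's API.

class LoggingCallback:
    def __init__(self):
        self.records = []

    def on_epoch_end(self, epoch, logs):
        # In practice this would push metrics to a tracker or stdout.
        self.records.append((epoch, logs["loss"]))

def train(n_epochs, callbacks):
    for epoch in range(n_epochs):
        loss = 1.0 / (epoch + 1)          # stand-in for a real loss
        for cb in callbacks:
            cb.on_epoch_end(epoch, {"loss": loss})

cb = LoggingCallback()
train(3, callbacks=[cb])
```

Swapping the body of `on_epoch_end` is then all it takes to redirect the same metrics to a different monitoring backend.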



This article was originally written by Jakub Czakon and posted on the Neptune blog.

Some time ago I had a chance to interview a great artificial intelligence researcher and the Chief AI Scientist at Lindera, Arash Azhand.

We talked about:

  • The AI technology behind his work at Lindera
  • His career path
  • What it is like to be a research-centered scientist
  • How to become a good leader
  • Why it is important to approach AI research from a business perspective

Here is the full video of our conversation:

You can watch the video to see the raw content.

Here, on the other hand, is a summary of our conversation. What follows is not a 1–1 transcript but rather a cleaned-up, structured, and rephrased version of it. …



This article was originally written by Kamil Kaczmarek and posted on the Neptune blog.

The International Conference on Learning Representations (ICLR) took place last week, and I had the pleasure of participating in it. ICLR is an event dedicated to research on all aspects of representation learning, commonly known as deep learning. Due to the coronavirus pandemic, the conference couldn't take place in Addis Ababa as planned and went virtual instead. That didn't spoil the great atmosphere of the event, quite the opposite: it was engaging and interactive, and attracted an even bigger audience than last year. …



This article was originally written by Kamil Kaczmarek and posted on the Neptune blog.

Machine learning algorithms are tunable via multiple knobs called hyperparameters. Recent deep learning models are tunable by tens of hyperparameters, which, together with data augmentation parameters and training procedure parameters, create quite a complex search space. In the reinforcement learning domain, you should also count environment parameters.

Data scientists should control the hyperparameter space well in order to make progress.

Here, we will show you recent practices, tips & tricks, and tools to track hyperparameters efficiently and with minimal overhead. …
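At its most minimal, tracking hyperparameters just means persisting every run's configuration as one flat record next to its results, so runs can be reloaded and compared later. The sketch below does this with plain JSON; the file name and keys are illustrative assumptions, and a dedicated experiment tracker replaces this with a single logging call.

```python
# A minimal sketch of low-overhead hyperparameter tracking: keep each
# run's config in one flat dict and persist it next to the results.
# File names and keys here are illustrative assumptions.
import json
import os
import tempfile

params = {
    "lr": 0.001,
    "batch_size": 64,
    "augmentation.flip": True,   # data augmentation params live here too
    "env.max_steps": 500,        # as do RL environment params
}

run_dir = tempfile.mkdtemp()
with open(os.path.join(run_dir, "params.json"), "w") as f:
    json.dump(params, f, indent=2)

# Later, any run's config can be reloaded and compared.
with open(os.path.join(run_dir, "params.json")) as f:
    restored = json.load(f)
```

Flat, dot-separated keys keep nested configs diffable across runs without any extra tooling.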

About

Patrycja Jenkner

Marketing Assistant at @neptune_ai
