
Intel® Distribution of Modin Usage and Performance Tuning Guide

25 Jul 2022 · CPOL · 4 min read
This article provides a guide to best practices and advice for using Intel® Distribution of Modin.

This article is a sponsored article. Articles such as these are intended to provide you with information on products and services that we consider useful and of value to developers.

To learn more about Intel® Distribution of Modin and how to get started, please visit the Intel® Distribution of Modin Getting Started Guide.

When To Use Intel® Distribution of Modin

The Intel® Distribution of Modin* is a performant, parallel, and distributed dataframe system designed to make data scientists more productive with the tools they love through a single-line code change, with exclusive optimizations for Intel hardware. The library is fully compatible with the Pandas API.

Typically, when developers use default Pandas function calls to process and analyze their data, they hit performance and memory bottlenecks once the data scales to a size that Pandas cannot comfortably handle.

Intel® Distribution of Modin aims to solve this problem. When developers hit this bottleneck, we recommend Intel® Distribution of Modin so they can keep using Pandas API calls while scaling their Pandas dataframes quickly and without practical limits – all with just a few lines of code.
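As a quick illustration of that single line of code, the only change from a typical Pandas script is the import statement; the file name below is just a placeholder:

# Typical Pandas script:
# import pandas as pd

# Drop-in replacement with Intel® Distribution of Modin:
import modin.pandas as pd

# The rest of the workload is unchanged; "data.csv" is just a placeholder path.
df = pd.read_csv("data.csv")
print(df.head())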

For more information on Intel® Distribution of Modin, please visit the Intel® Distribution of Modin homepage.

Using Modin

Intel® Distribution of Modin is compatible with three different backend compute engines that Modin uses to distribute and optimize Pandas API calls and computations (a short engine-selection sketch follows the list):

  • Ray Backend (recommended) – The Ray* backend is the recommended backend engine for Intel® Distribution of Modin. It has the most Pandas API functionality enabled, as well as the most stable implementation with Intel® Distribution of Modin.
  • OmniSci Backend – In partnership with OmniSci* (now Heavy.AI), Intel® Distribution of Modin supports OmniSci as a backend: a very performant framework for end-to-end analytics that has been optimized to harness the computing power of existing and emerging Intel® hardware. Please note that this backend is currently experimental.
  • Dask Backend – The Dask* backend is recommended for workloads running on Windows operating systems and on Intel® DevCloud for oneAPI.
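One documented way to choose between these engines is to set the MODIN_ENGINE environment variable before importing modin.pandas. The sketch below assumes the "ray" and "dask" values shown are valid for the Modin version you have installed; the experimental OmniSci backend is configured through separate settings described in the documentation.

import os

# Set the engine before importing modin.pandas. "ray" (recommended) and "dask"
# are the values documented for the Ray and Dask backends; the experimental
# OmniSci backend is enabled through separate settings described in the
# Intel® Distribution of Modin documentation.
os.environ["MODIN_ENGINE"] = "ray"   # or "dask" on Windows / Intel® DevCloud

import modin.pandas as pd

df = pd.DataFrame({"a": range(10)})
print(df.sum())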

To learn more about how to get started with using Modin, please reference the Intel® Distribution of Modin Getting Started Guide.

Performance Tuning with Intel® Distribution of Modin

How to Tune Modin Function Calls with Pandas

Because Modin provides out-of-the-box performance benefits on top of Pandas, there may still be limitations or operations not yet covered by Modin's optimizations. Modin and Pandas make it possible to switch between the two frameworks at run time if you experience performance setbacks. Some of the common reasons are listed below in the section "Reasons Why Modin May Default Back to Pandas".

As a workaround, it is recommended to convert the Modin object to a Pandas object so that both frameworks can be used efficiently. Sample code for moving between Modin and Pandas objects is given below.

import os

import ray
ray.init()
import modin.pandas as pd

# savePath, logName, and self.df_log are defined in the surrounding application
# (this snippet comes from a class method).
df_log = pd.concat([self.df_log])
df_log.to_csv(os.path.join(savePath, logName + '_structured.csv'), index=False)

Now, to process df_log with Pandas, simply convert the Modin dataframe to a Pandas object using the "_to_pandas()" method:

# Note: this import rebinds pd from modin.pandas to stock Pandas.
import pandas as pd

# _to_pandas() materializes the Modin column as a regular Pandas Series.
occ_dict = dict(df_log['EventTemplate']._to_pandas().value_counts())
df_event = pd.DataFrame()
df_event['EventTemplate'] = df_log['EventTemplate'].unique()

Controlling the Number of Cores

If you would like to control the number of cores that Intel® Distribution of Modin will utilize, versus the default of using all available cores on a device, then please visit the relevant documentation.
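As a rough sketch of what that documentation describes, the core count can typically be limited either through Modin's MODIN_CPUS environment variable or, with the Ray backend, by initializing Ray yourself with a CPU cap; the value 4 below is an arbitrary example.

import os

# MODIN_CPUS caps the number of cores Modin's workers use; set it before
# importing modin.pandas. The value 4 is an arbitrary example.
os.environ["MODIN_CPUS"] = "4"

# With the Ray backend you can alternatively initialize Ray yourself with a
# CPU limit before Modin starts it implicitly.
import ray
ray.init(num_cpus=4)

import modin.pandas as pd

df = pd.DataFrame({"a": range(1_000_000)})
print(df.mean())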

More Performance Tuning Information

For more information on performance tuning with Intel® Distribution of Modin, please visit the relevant open-source documentation.

Reasons Why Modin May Default Back to Pandas

If Intel® Distribution of Modin falls back to default Pandas functionality, it is likely for one of the following reasons:

  • The function is already optimized by Pandas and using Intel® Distribution of Modin will not provide any more performance improvements at this time.
  • The method is not yet implemented by Intel® Distribution of Modin in the backend currently being used.

For more information, please visit the Modin documentation section: Defaulting to Pandas.
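If you want to see the fallback happen at run time, Modin emits a non-critical UserWarning when an operation defaults to Pandas. The exact wording varies by release, but a sketch like the following makes it visible; the apply call is only an example and may or may not trigger a fallback on your backend.

import warnings

import modin.pandas as pd

df = pd.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})

# Modin raises a non-critical UserWarning when an operation falls back to
# Pandas; the message text varies by release but typically mentions
# "defaulting to pandas".
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    df.apply(lambda row: row.sum(), axis=1)  # example only; may or may not fall back
    for w in caught:
        if "defaulting to pandas" in str(w.message).lower():
            print("Fell back to Pandas:", w.message)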

If the function should be supported by the Intel® Distribution of Modin backend engine you are using, according to the Supported APIs documentation, please raise an issue on the Modin GitHub.

Using Default Pandas Implementation

Intel® Distribution of Modin is meant to effortlessly speed up Pandas workloads by distributing Pandas data and computation.

When the dataset size is very small, we recommend that developers use the default Pandas import and calls first, instead of Intel® Distribution of Modin. At this size, the performance benefits of Intel® Distribution of Modin are negligible, since the Pandas package is well suited to small datasets.

At this data size, users may also see a slight slow-down when using Intel® Distribution of Modin compared to default Pandas. This is because Modin incurs additional overhead to distribute the data before calling Pandas functions. Pandas does not require this overhead, which causes the discrepancy; it disappears as the dataset size continues to scale up.
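A rough, non-rigorous timing sketch like the one below (the dataset size and operation are arbitrary) can help confirm whether your data is still in the range where plain Pandas wins:

import time

import pandas
import modin.pandas as mpd

# Not a rigorous benchmark: on a frame this small, Modin's distribution
# overhead usually makes it no faster than plain Pandas. Sizes and the
# operation are arbitrary.
data = {"a": range(10_000), "b": range(10_000)}

start = time.perf_counter()
pandas.DataFrame(data).describe()
print("pandas:", time.perf_counter() - start, "s")

start = time.perf_counter()
mpd.DataFrame(data).describe()
print("modin :", time.perf_counter() - start, "s")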

About Modin Warnings

Please note that if you see a series of non-critical warnings when using Intel® Distribution of Modin, it does not mean that you are using the package incorrectly. This is the "verbose" log that is automatically generated when using Intel® Distribution of Modin, and it can be helpful when debugging problems. A future release will add an option to turn most of these warnings on and off using a "verbose" mode.
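Until that option arrives, the standard Python warnings machinery can be used as a generic workaround to silence the non-critical messages. The filters below are assumptions about the warning text and module path and may need adjusting for your Modin version:

import warnings

# Generic Python workaround, not a Modin feature: the message and module
# patterns below are assumptions and may need adjusting for your version.
warnings.filterwarnings("ignore", message=".*defaulting to pandas.*")
warnings.filterwarnings("ignore", category=UserWarning, module="modin.*")

import modin.pandas as pd  # import after the filters are in place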

For More Information

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL).

