Machine Learning and AI expert
Debonil Chowdhury
Epsom, United Kingdom
I'm offering
I have around 15 years of experience building scalable machine learning models and deploying them on various cloud infrastructures.
My expertise spans demand forecasting, supply chain optimisation, anomaly detection, segmentation, churn prediction and many other areas.
I worked at dunnhumby for 10 years, and over the last 2 years I have been freelancing. My current clients include Unilever, Clarks, Superdry, DHL, the UK Department for Work and Pensions and many others.
I have advanced knowledge of Python, Spark and R, as well as end-to-end Azure.
Markets
United Kingdom
Language
English
Fluent
Ready for
Larger project
Ongoing relation / part-time
Full time contractor
Available
My experience
2018 - ?
temp
DATA SCIENTIST
• Client (Unilever)
◦ Developed an end-to-end machine learning model for demand forecasting for Unilever on the Azure Databricks platform.
◦ Developed an algorithm to predict future dispatch rate issues in Unilever's supply chain; the model also automatically identifies the drivers of service quality disruption and recommends actions.
◦ Designed the entire architecture in Azure using Data Factory and Azure Databricks; this solution is now one of Unilever's key global supply chain optimisation models.
◦ Developed a segmentation approach to demand profiling: each product's historical demand is treated as a time series, and products are clustered by the characteristics of their series. A separate machine learning model is then built per cluster to handle its particular time-series characteristics.
◦ All solutions were built on the Azure Databricks platform with fully automated end-to-end pipelines.
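The cluster-then-model idea above can be sketched minimally. Everything here is invented for illustration (the summary features, the cluster names, the 0.5 and 0.1 cut-offs, and the simple fallback forecasts); the production Databricks models would be far richer:

```python
import math

def series_features(ts):
    """Summarise a demand history as (mean, coefficient of variation, trend)."""
    n = len(ts)
    mean = sum(ts) / n
    var = sum((x - mean) ** 2 for x in ts) / n
    cv = math.sqrt(var) / mean if mean else 0.0
    # Least-squares slope of demand against time, as a crude trend measure.
    t_mean = (n - 1) / 2
    slope = (sum((t - t_mean) * (x - mean) for t, x in enumerate(ts))
             / sum((t - t_mean) ** 2 for t in range(n)))
    return mean, cv, slope

def assign_cluster(ts, cv_cut=0.5, slope_cut=0.1):
    """Route a product to a cluster by its demand-series characteristics."""
    _, cv, slope = series_features(ts)
    if cv > cv_cut:
        return "intermittent"          # erratic demand
    return "trending" if abs(slope) > slope_cut else "stable"

def forecast(ts):
    """Per-cluster model choice: naive mean, trend extrapolation, or
    short moving average, one step ahead."""
    cluster = assign_cluster(ts)
    mean, _, slope = series_features(ts)
    if cluster == "stable":
        return mean
    if cluster == "trending":
        return ts[-1] + slope          # extrapolate one step
    return sum(ts[-3:]) / 3            # robust average for erratic series
```

In the real pipeline, each cluster would get its own trained model rather than these closed-form stand-ins.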
• Client (Clarks)
◦ Developed an algorithm that predicts the size-level distribution of shoe demand by region in the UK, US and EU.
◦ The model looks at several historical features for a particular shoe and predicts the correct assortment of sizes and fits by region.
◦ The solution is fully automated: it runs on the server every Sunday night, once the previous week's historical data has been updated.
◦ It then runs the model and prepares a scorecard that is pushed into an SAP table and consumed by the procurement team through a Power BI tool.
• Client (Thought Provoking Consulting - Superdry)
◦ Developed an algorithm that estimates demand elasticity and generates predictions for markdown discounts.
◦ The solution is automated end to end, so it can be run for any client and in any market, provided the input data structures are the same.
◦ The solution is hosted on the client's in-house server.
• Client (The Mix London)
◦ The Mix London is a market research company that conducts research across the globe for shoe brands.
◦ I developed an automated clustering algorithm that looks at questionnaire responses and clusters respondents by their propensity to respond to different questions.
◦ First a factor-analytic setup identifies latent factors, and a Gaussian mixture model is then fitted on the extracted factors to form the clusters.
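A minimal illustration of the Gaussian-mixture step: a hand-rolled one-dimensional EM fit standing in for the full factor-analysis-plus-GMM pipeline. In practice the inputs would be multi-dimensional factor scores and a library implementation would be used; `fit_gmm_1d` and its defaults are invented here:

```python
import math

def fit_gmm_1d(xs, iters=50):
    """Minimal EM for a two-component 1-D Gaussian mixture.
    Returns (weights, means, sds). Initialised deterministically at the
    extremes of the data for reproducibility."""
    means = [min(xs), max(xs)]
    sds = [1.0, 1.0]
    weights = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        # (normalising constants cancel, so 1/sqrt(2*pi) is omitted).
        resp = []
        for x in xs:
            dens = [w * math.exp(-0.5 * ((x - m) / s) ** 2) / s
                    for w, m, s in zip(weights, means, sds)]
            total = sum(dens) or 1e-12
            resp.append([d / total for d in dens])
        # M-step: re-estimate parameters from responsibilities.
        for j in range(2):
            nj = sum(r[j] for r in resp)
            means[j] = sum(r[j] * x for r, x in zip(resp, xs)) / nj
            sds[j] = max(1e-3, math.sqrt(
                sum(r[j] * (x - means[j]) ** 2 for r, x in zip(resp, xs)) / nj))
            weights[j] = nj / len(xs)
    return weights, means, sds
```

On well-separated factor scores, the fitted means recover the two latent groups; each respondent is then assigned to the component with the highest responsibility.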
• Other clients (including DHL and Delhaize in the US)
◦ Machine learning models for:
* Churn prediction
* Promotional propensity modelling
* Personalisation algorithms for recommender systems
* Automated campaign performance evaluation
* Demand forecasting
Machine learning, Market research, Forecasting, Azure, SAP, Power BI, Procurement, Consulting, Architecture
2017 - 2018
job
HEAD OF DATA SCIENCE
AVON.
• Led a team of 5 senior and junior data scientists, enabling Avon to gain incremental revenue by better understanding its customers, with data as the basis of key decision-making.
• Developed a data-centric strategy forming part of Avon's global priorities.
• Example projects:
◦ Understanding Avon representative behaviour
* Developed and implemented a model to predict inactivity of Avon representatives, using autoregressive features fitted with extreme gradient boosting.
* Developed the concept of predictive loyalty through this model, together with lifetime value.
* Currently embedding this loyalty segmentation as a key business KPI to track performance.
* Developed an algorithm to generate lists of offers for representatives at risk.
* Implemented an automated process to evaluate such marketing activities (A/B testing).
* This loyalty and inactivity prediction approach was rolled out to 52 markets worldwide in just 3 weeks, using automated hyperparameter tuning and feature selection.
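The autoregressive-feature idea above can be sketched as a feature builder whose rows would then be fed to a gradient-boosted classifier such as XGBoost. The weekly-order framing and the feature set (`n_lags` of history, recency, trend) are illustrative assumptions, not the production design:

```python
def inactivity_features(orders, n_lags=4):
    """Build autoregressive features from a representative's weekly order
    counts: the last n_lags weeks, plus recency (weeks since the last
    order) and a crude trend. Label = 1 if the NEXT week is inactive."""
    rows, labels = [], []
    for t in range(n_lags, len(orders)):
        lags = orders[t - n_lags:t]
        recency = next((i for i, v in enumerate(reversed(lags)) if v > 0),
                       n_lags)
        trend = lags[-1] - lags[0]
        rows.append(lags + [recency, trend])
        labels.append(1 if orders[t] == 0 else 0)
    return rows, labels
```

Usage would look like `X, y = inactivity_features(weekly_orders)` followed by fitting the classifier on `(X, y)`.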
◦ Demand Forecasting
* Developing an algorithm that uses an ensemble of XGBoost and log-log elasticity models to predict demand for an item once its price, promotion, offer and positioning are decided.
* Currently building a simulator on top of the model so that marketing planners can assess the effect of their decisions in real time.
* This project is estimated to save $80 million over the next year.
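As a sketch of the log-log elasticity half of such an ensemble: under the constant-elasticity model demand = a * price ** b, the elasticity b is simply the OLS slope of log demand on log price. The function names here are invented for illustration:

```python
import math

def log_log_elasticity(prices, demands):
    """Fit demand = a * price ** b by OLS on logs.
    Returns (a, b), where b is the price elasticity."""
    lx = [math.log(p) for p in prices]
    ly = [math.log(d) for d in demands]
    mx, my = sum(lx) / len(lx), sum(ly) / len(ly)
    b = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
         / sum((x - mx) ** 2 for x in lx))
    a = math.exp(my - b * mx)
    return a, b

def predict_demand(a, b, price):
    """Point forecast from the fitted constant-elasticity model."""
    return a * price ** b
```

In the ensemble described above, this closed-form component would be blended with an XGBoost model that captures promotion, offer and positioning effects.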
◦ Strategic projects
* Developed the long-term and short-term strategic vision for data science at Avon.
* Based on our POCs and discussions with markets, presented the 2018 budget proposal, which was approved by the executive board.
* Working to define consistent cloud data platforms (Azure, AWS) across all markets to enable:
• strategic projects to be rolled out to multiple markets at scale;
• consistent treatment and interpretation of data across Avon, building a common language and data culture across the business.
* All proposed projects use open-source software, with cost implications limited to cloud hosting.
Marketing, Data Science, AWS, Budgeting, Forecasting, Azure, Cloud, Open source, A/B testing, KPIs
2016 - 2016
job
HEAD OF DATA SCIENCE
CUSTOMER STRATEGY VERTICAL.
• I headed the data science unit for the Customer Strategy vertical within dunnhumby, accountable for producing cutting-edge data science solutions for products and services that answer client questions about customer behaviour (attitudinal as well as behavioural). This function is at the heart of all dunnhumby solutions.
• In this role, I developed an algorithm using a hierarchical random forest model to identify the root causes of trade performance issues, and implemented it in a Hadoop environment using Python so that it can be installed even on client premises.
• A few other projects: latent Dirichlet allocation for sentiment analysis, Gaussian mixture models for creating correlated customer segments, and Lasso regression on a log-log model to decompose sales changes into the contributions of various retailing decisions.
• Worked on a live project using agent-based models and reinforcement learning to develop a solution proposal for marketing mix modelling.
• In addition, supervised several other projects run by my direct reports.
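The log-log decomposition of sales change mentioned above can be illustrated as follows, assuming the driver coefficients have already been fitted (e.g. by the Lasso regression described). The driver names and values below are invented:

```python
import math

def decompose_sales_change(coefs, drivers_before, drivers_after):
    """Split a sales change into per-driver contributions under a log-log
    model: log(sales) = const + sum(b_i * log(x_i)), so the change in
    log sales decomposes exactly as sum(b_i * delta_log(x_i))."""
    return {name: b * (math.log(drivers_after[name]) -
                       math.log(drivers_before[name]))
            for name, b in coefs.items()}
```

For example, with a price elasticity of -1.2 and a distribution coefficient of 0.8, a simultaneous 10% price rise and 10% distribution gain decompose into a negative price contribution that outweighs the distribution gain, giving a net sales decline.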
Marketing, Python, Data Science, Hadoop, Sales
2006 - 2008
job
ANALYST
GENPACT.
I worked in the actuarial model development and pricing team. My responsibilities were building loss models and pricing models for Swiss Re, applying statistical modelling techniques with software such as SAS, Matlab, VBA, Oracle Data Miner and Enterprise Miner.
Oracle, Matlab, VBA, SAS, Actuarial modelling
My education
Indian Institute of Technology Kanpur
N/a, Statistical
Debonil's reviews
Debonil has not received any reviews on Worksome.