Data Scientist (8 years of relevant experience)
Mudita Shrivastava
London, United Kingdom
Experience
Other titles
Skills
I'm offering
♦ Expertise in Analytics, Data Science, DWH, and BI reporting for clients across the banking, supply chain, manufacturing, energy, and healthcare domains.
♦ Converts data into actionable business insights and presents them in a business-friendly and influential manner. Excellent understanding of business operations and analytics tools to help drive strategic decision-making through data.
♦ Stakeholder management and excellent communication and presentation skills, helping clients formulate concise business problems.
Markets
United States (Remote only)
United Kingdom
France (Remote only)
Germany (Remote only)
Lithuania (Remote only)
Denmark (Remote only)
Norway (Remote only)
Sweden (Remote only)
Finland (Remote only)
Language
English
Fluent
Ready for
Larger project
Ongoing relation / part-time
Full time contractor
Available
My experience
2019 - 2019
job
Data Scientist
BRITISH PETROLEUM.
• Worked in the COE team to study attrition and bonus distribution patterns, with the objective of optimizing the current structure, succession planning, and other HR business problems.
• Extensive stakeholder interaction, interpreting technical output for a non-technical audience.
Some Projects Accomplished:
Carbon Footprint Calculation (Python, Geospatial):
• Calculated the total carbon footprint of a segment of the employee population based on the distance travelled from home to office.
• Used the Google Maps API to calculate the distance between geospatial locations, then estimated CO2 emissions according to government statistics.
• This was a significant step towards environmental safety on behalf of the organization.
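The commute-footprint idea above reduces to distance × trips × emission factor. A minimal sketch, with an illustrative emission factor and made-up distances (not data from the actual project):

```python
# Hypothetical average-car figure, kg CO2 per km; the real project used
# government statistics.
EMISSION_FACTOR_KG_PER_KM = 0.171

def commute_footprint_kg(distance_km: float, trips_per_year: int = 460) -> float:
    """Annual CO2 (kg) for one employee's home-office commute."""
    return distance_km * trips_per_year * EMISSION_FACTOR_KG_PER_KM

# Distances as they might be returned by a routing API (illustrative values).
commutes_km = [5.2, 18.0, 33.5]
total = sum(commute_footprint_kg(d) for d in commutes_km)
```

Summing per-employee footprints over a population segment gives the aggregate figure the project reported.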
Bonus Distribution Trends and Optimization (Tableau, Power BI, R):
• Business objective: study the current bonus distribution pattern and how strongly it relates to performance and other factors.
• Performed a deep analysis of bonuses distributed over the last 5 years, including the correlation of bonus with performance factors and linear regression to understand how much the bonus amount depends on various factors.
• The analysis produced surprising insights that helped the stakeholders redesign their strategies.
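The regression step above fits bonus against candidate drivers and reads off each coefficient. A sketch on synthetic data (the project itself used R; feature names here are invented for illustration):

```python
import numpy as np

# Synthetic stand-in for the bonus data: bonus depends mostly on the
# performance rating, weakly on tenure, plus noise.
rng = np.random.default_rng(0)
performance = rng.uniform(1, 5, 200)   # performance ratings
tenure = rng.uniform(0, 15, 200)       # years of service
bonus = 1000 * performance + 50 * tenure + rng.normal(0, 200, 200)

# Ordinary least squares: [intercept, performance, tenure] coefficients.
X = np.column_stack([np.ones_like(performance), performance, tenure])
coef, *_ = np.linalg.lstsq(X, bonus, rcond=None)
# coef[1] estimates how much the bonus moves per unit of performance rating.
```

Comparing the fitted coefficients against stakeholders' expectations is what surfaced the "surprising insights" the bullet describes.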
Python, Power BI, R, Tableau, Statistics
2016 - 2018
job
Data Analytics Team Lead
ACCENTURE.
• As part of the core Data Analytics team, won many business deals for Accenture. This involved analyzing clients' data, identifying business problems, data mining on structured/unstructured data, data cleansing, visualization, and delivering business insights.
• Provided automation recommendations and cost/effort optimization solutions based on data analysis, patterns, and statistical analysis.
• Tools used: Tableau for data exploration and visualization, R for modelling and prediction, Python for pattern analysis.
• Coached team members on advanced analytics concepts.
Projects Accomplished:
Housing Data Analysis (Big Data, Cloudera, PySpark):
• Analyzed all property transactions in a particular area over the last 15 years to understand trends in sales, prices, and other factors.
• The data was too large to handle outside a big data environment, so it was analyzed on a Cloudera server.
Text Analytics (Python, NLP):
• Web-scraped product reviews and performed sentiment analysis on them to gauge a product's public image.
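At its simplest, review sentiment can be scored by counting lexicon hits; a toy sketch of that idea (the actual project used Python NLP tooling, and the lexicons here are tiny illustrative sets):

```python
# Tiny illustrative lexicons; a real pipeline would use a proper
# sentiment lexicon or a trained model.
POSITIVE = {"great", "excellent", "love", "good"}
NEGATIVE = {"bad", "poor", "hate", "broken"}

def sentiment(review: str) -> int:
    """+1 positive, -1 negative, 0 neutral, by lexicon word counts."""
    words = review.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return (score > 0) - (score < 0)

reviews = ["Great product, love it", "Broken on arrival, poor quality"]
scores = [sentiment(r) for r in reviews]
```

Aggregating such scores across scraped reviews gives the product-image signal the bullet refers to.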
Sales Forecast Tool (R, R Shiny):
• Business objective: forecast sales for the next three months based on historical data.
• Time series forecasting models (ARIMA, Auto-ARIMA, Holt-Winters) were embedded in the backend of an R Shiny UI to present the forecasted values.
• Machine learning models were executed in R; Tableau was used to explore patterns and gain a better understanding of the data.
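The tool above pairs a forecasting model with a UI. As a rough illustration of the modelling half, here is simple exponential smoothing in plain Python; the actual project used ARIMA and Holt-Winters in R, and the sales figures below are made up:

```python
def ses_forecast(series, alpha=0.3, horizon=3):
    """Simple exponential smoothing: smooth the history into a level,
    then project that level flat over the forecast horizon."""
    level = series[0]
    for value in series[1:]:
        level = alpha * value + (1 - alpha) * level
    return [level] * horizon

history = [112, 118, 132, 129, 121, 135]  # illustrative monthly sales
forecast = ses_forecast(history)          # three months ahead
```

Holt-Winters extends this idea with trend and seasonal components, which is why it suits the three-month sales horizon the project targeted.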
Supply Chain Analytics (Optimization):
• The business problem was to create an optimized transport network for a shipping organization, with the objectives of reducing cost and maximizing fuel efficiency.
• The CPLEX solver was used in the GAMS IDE to solve the optimization problem.
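The optimization above is a transportation problem: choose shipment quantities that meet demand at minimum cost. A toy brute-force version (the real project used CPLEX on GAMS; depots, ports, and costs here are invented):

```python
from itertools import product

# Hypothetical two-depot, two-port instance; unit shipping costs.
costs = {("D1", "P1"): 4, ("D1", "P2"): 6,
         ("D2", "P1"): 5, ("D2", "P2"): 3}
supply = {"D1": 2, "D2": 2}
demand = {"P1": 2, "P2": 2}

best_cost, best_plan = None, None
# Enumerate how many units each depot sends to P1; the rest goes to P2.
for a, b in product(range(supply["D1"] + 1), range(supply["D2"] + 1)):
    if a + b != demand["P1"]:
        continue  # P1 demand must be met exactly
    plan = {("D1", "P1"): a, ("D1", "P2"): supply["D1"] - a,
            ("D2", "P1"): b, ("D2", "P2"): supply["D2"] - b}
    cost = sum(costs[route] * units for route, units in plan.items())
    if best_cost is None or cost < best_cost:
        best_cost, best_plan = cost, plan
```

A real network has far too many routes to enumerate, which is why a linear-programming solver such as CPLEX is used instead of brute force.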
IT Incident Data Analytics (Tableau, Splunk):
• The client's ServiceNow incidents were analyzed to find the exact issues in the applications, in order to minimize the number of incidents logged per month.
• A Splunk-based tool was developed to analyze incident data automatically, generate inferences, and create a word cloud from the text descriptions that contained the reason each incident was logged. Probable automation solutions were then recommended to deflect or minimize logged incidents.
• A large business audience was trained to use the Splunk-based tool so that reports could be generated automatically.
• Splunk was used to find text patterns in large volumes of unstructured text in order to extract the exact issues.
• Extensive Tableau dashboards were created to identify anomalies, patterns, and trends in incident logging across different features.
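Behind the incident word cloud sits a word-frequency count over the incident descriptions. A toy sketch of that step (the real tool ran in Splunk; the descriptions and stopword list below are invented):

```python
from collections import Counter
import re

# Made-up incident descriptions standing in for ServiceNow text fields.
descriptions = [
    "Login page timeout after password reset",
    "Timeout connecting to payment gateway",
    "Password reset email not received",
]
STOPWORDS = {"after", "to", "not"}

words = [w for d in descriptions
         for w in re.findall(r"[a-z]+", d.lower())
         if w not in STOPWORDS]
top = Counter(words).most_common(3)  # the heaviest words drive the cloud
```

Recurring terms like these point at the underlying application issues, which is what the recommended automation fixes targeted.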
HR/People Analytics (R, R Shiny, Tableau):
• Developed an employee attrition prediction tool based on HR data.
• Logistic regression, decision tree, random forest, and other machine learning models were built to predict whether a given employee is likely to leave, based on factors from the past year's data.
• An R Shiny based UI gave end users a high-level view of the data.
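Of the models listed, logistic regression is the simplest to sketch. A from-scratch gradient-descent version on made-up data (the project used R; the two features and labels here are invented for illustration):

```python
import math

def sigmoid(z: float) -> float:
    return 1 / (1 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Plain stochastic-gradient-descent logistic regression:
    one weight per feature plus a bias term."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Hypothetical features: [overtime hours/week, years since last promotion].
X = [[0, 1], [2, 0], [10, 4], [12, 5], [1, 2], [11, 6]]
y = [0, 0, 1, 1, 0, 1]  # 1 = left the company
w, b = fit_logistic(X, y)
p_leave = sigmoid(sum(wj * xj for wj, xj in zip(w, [9, 4])) + b)
```

The tree-based models in the bullet capture non-linear interactions the linear model misses, which is why several model families were compared.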
Forecasting, People Analytics, Visualization, Advanced Analytics, NLP, Shipping, IT, Embedded, Network, Service, Python, Analytics, Word Cloud, Tableau, R, Automation, UI, Data Mining, Big Data, Machine Learning, Backend, Data Analysis
2012 - 2016
job
DWH professional and ETL Developer
L&T INFOTECH.
Contributed to a large-scale data transformation environment involving data staging from various sources, reformatting, feature engineering, processing, and loading onto downstream systems.
• Single-handedly redesigned an end-to-end process, which involved implementing it on a new platform, testing it, and productionizing it. This was done under stringent timelines and was highly appreciated.
• The project involved understanding the finances of an investment bank, capital market concepts, and profit and loss statements; the data revolved around equity and derivatives.
• Large volumes of data were hosted on partitioned UNIX systems. Data was occasionally converted to a serial format for processing and later back to a partitioned format for storage.
• The work also involved data exploration and extraction using SQL queries in the Oracle SQL Developer environment.
SQL, Oracle, UNIX, ETL, Engineering, IT, Oracle SQL, Developer, Transformation, Testing, Feature, Processing
My education
2017 - 2018
Indian School of Business
Executive program, Advanced Analytics
2007 - 2011
University of Mumbai
Bachelor of Engineering, Information Technology
Mudita's reviews
Mudita has not received any reviews on Worksome.