Mrunmaya Dash

Kolkata, West Bengal, India

Phone: xxx-xxx-xxxx

Email: xxx@xxxx.xxx



  • Looking For: Hadoop Developer, Big Data and Hadoop Developer

  • Occupation: IT and Math

  • Degree: Bachelor's Degree

  • Career Level: Experienced

  • Languages: English

Career Information:


Highlights:
  • Result-oriented professional with 11.1 years of experience in data warehouse architecture, application design, project management, liaison & coordination, and team management
  • Expertise in loading and transforming large datasets of structured, semi-structured and unstructured data into the Hadoop ecosystem
  • Skilled in data warehouse architecture and in designing star schemas, snowflake schemas, fact and dimension tables, and physical and logical data models
  • Proficient in architecting real-time streaming applications and batch-style, large-scale distributed computing applications using tools such as Spark Streaming, Spark SQL, Hive, Pig, Scala, Impala and HBase
  • Well versed in Hadoop architecture and its components: HDFS, Job Tracker, Task Tracker, Name Node and Data Node
  • Hands-on experience importing and exporting data between HDFS/Hive and relational stores using Sqoop
  • An enterprising leader, skilled in guiding personnel toward the accomplishment of common goals
  • Undertook initiatives such as a successful PoC and requirement gathering to ensure Hadoop development was in place
  • Received the Top Performer Award for delivering the MDM_GUID project to the client's satisfaction, and led the team in the right direction on the Optimus project
  • Implemented root-cause analysis and risk measures for Horizon, sending data to third-party vendor GDIT, which in turn saved millions of dollars for the client
  • Played a key role in developing UNIX scripts, taking the initiative to improve the performance of a long-running daily cycle
  • Overcame resource contention, SIT/UAT testing constraints and requirement changes during UAT on the Lumeris Extract, ICD-10 and MDM_GUID projects by reassigning resources and stretching work to go the extra mile, satisfying the customer and meeting delivery timelines

SKILL SET
Datawarehouse Architecture, Application Design & Development, Technology Management, Software Testing, Project Management, Client Relationship Management, Liaison & Coordination, Team Management

WORK EXPERIENCE
Since May '17 with TCS, Kolkata; presently designated as Hadoop Developer (Amgen)
Growth path:
  • Jul '08 to Dec '12: Mainframe Developer and Unix Developer (Enterprise Datawarehouse)
  • Jan '13 to Oct '15: Mainframe Developer and Unix/Informatica Developer (Datawarehouse ICoE)
  • Nov '15 to Apr '17: Hadoop Developer (Horizon Data Hub - HDH)
  • May '17 to present: Hadoop Developer (Amgen)
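The star-schema design mentioned in the highlights centers a fact table of measures on surrogate keys into small dimension tables. A minimal sketch of the idea, using SQLite and hypothetical table and column names (not taken from the candidate's projects):

```python
import sqlite3

# Minimal star-schema sketch: one fact table keyed to two dimension
# tables. All table and column names here are hypothetical.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, cal_date TEXT);
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (
        date_id    INTEGER REFERENCES dim_date(date_id),
        product_id INTEGER REFERENCES dim_product(product_id),
        amount     REAL
    );
    INSERT INTO dim_date    VALUES (1, '2017-05-01'), (2, '2017-05-02');
    INSERT INTO dim_product VALUES (10, 'widget'), (11, 'gadget');
    INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 11, 50.0), (2, 10, 25.0);
""")

# A typical star-schema query: join the fact table to a dimension and
# aggregate a measure by a dimension attribute.
rows = con.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 50.0), ('widget', 125.0)]
```

The same layout scales to a snowflake schema by normalizing the dimension tables further.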

Skills: Sqoop, Hive, Spark, Scala, Impala, HBase, HDFS, ETL, Unix Shell Script

Goal: Technical Consultant in Big Data and Hadoop
  • Identify personal boundaries at work and know what to do to make the day more productive and manageable
  • Communicate more effectively at work
  • Feel happier and more positive during the workday
  • Be more organized with daily goals
  • De-clutter the workspace and stay organized throughout the week
  • Become a mentor
  • Be known as an expert in a certain field or area
  • Manage clients better and more efficiently
  • Improve company profitability by a certain percentage
  • Delegate work and tasks more effectively to increase own productivity

Certification: 2016 Simplilearn Certified in Big Data and Hadoop Development

Honor:
  • Acknowledged for IBM top performer ranking in 2009, 2013 and 2014
  • IBM Service Excellence Awards in 2009 and 2014
  • IBM Manager's Choice Award in 2015


Experiences:

Hadoop Developer 05/2017 - current
Tata Consultancy Services, Kolkata, West Bengal India
Industry: IT
  • Result-oriented professional with 11.1 years of experience in data warehouse architecture, application design, project management, liaison & coordination, and team management
  • Expertise in loading and transforming large datasets of structured, semi-structured and unstructured data into the Hadoop ecosystem
  • Hands-on experience importing and exporting data into HDFS and Hive using Sqoop
  • An enterprising leader, skilled in guiding personnel toward the accomplishment of common goals
Role (Client: Amgen, California), as Hadoop Developer (Amgen EDL, Enterprise Data Lake):
  • Created technical specifications and design documents based on the functional specifications and data model
  • Built scalable distributed data solutions using Hadoop
  • Gathered requirements and participated in Agile planning meetings to finalize the scope of each development
  • Monitored the migration of ETL processes from Oracle/Teradata/SAP to Hive to verify easy data manipulation
  • Managed system administration and Unix system maintenance
  • Instituted importing and exporting of data from Oracle/Teradata/SAP into HDFS and Hive using Sqoop
  • Troubleshot issues in MapReduce job execution by inspecting and reviewing log files
  • Gave customers data visibility using Impala
  • Optimized Scala jobs on HDFS using compact formats such as Avro and Parquet
  • Configured periodic incremental imports of data from Oracle/Teradata/SAP into HDFS using Sqoop
  • Created a complete processing engine based on the Cloudera distribution, tuned for performance
  • Created partitions, dynamic partitions and buckets for granularity and optimization using HiveQL
  • Developed Sqoop scripts to move data between HDFS and RDBMS sources (Oracle, Teradata, SAP)
  • Ran Hive queries and Pig scripts to study customer behaviour
  • Developed IV (Infrastructure Verification) and IQ (Infrastructure Quality) scripts for system changes
  • Identified job dependencies to design Oozie workflows and YARN resource management
Environment: Hadoop, Spark 1.6.0, Scala 2.11, HDFS, Hive, Hue, Impala, HBase, Cloudera Manager, ETL, Sqoop, Oozie, ZooKeeper, Teradata, SAP, Eclipse, Unix, Crontab
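The Hive dynamic partitioning mentioned above writes each distinct partition-key value into its own key=value subdirectory, so queries can prune whole directories. A minimal sketch of that layout, using the local filesystem in place of HDFS, with hypothetical records and column names:

```python
import os
import tempfile

# Sketch of the directory layout Hive-style dynamic partitioning
# produces: one key=value subdirectory per distinct partition value.
# The local filesystem stands in for HDFS; data is hypothetical.
records = [
    {"country": "US", "user": "a"},
    {"country": "IN", "user": "b"},
    {"country": "US", "user": "c"},
]

base = tempfile.mkdtemp()
for rec in records:
    # Partition value becomes part of the path, not part of the row.
    part_dir = os.path.join(base, f"country={rec['country']}")
    os.makedirs(part_dir, exist_ok=True)
    with open(os.path.join(part_dir, "part-00000"), "a") as fh:
        fh.write(rec["user"] + "\n")

print(sorted(os.listdir(base)))  # ['country=IN', 'country=US']
```

A query filtered on `country` then only needs to read the matching directory, which is the optimization the partitioning bullets refer to.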
Application Developer 07/2008 - 04/2017
IBM, Bangalore, Karnataka India
Industry: IT
  • Skilled in data warehouse architecture and in designing star schemas, snowflake schemas, fact and dimension tables, and physical and logical data models
  • Proficient in architecting real-time streaming applications and batch-style, large-scale distributed computing applications using tools such as Spark Streaming, Spark SQL, Hive, Pig, Scala, Impala and HBase
  • Expertise in loading and transforming large datasets of structured, semi-structured and unstructured data into the Hadoop ecosystem
Software Developer 07/2006 - 07/2008
NIIT Technologies, Kolkata, West Bengal India
Industry: IT
Entry-level developer; started my career in mainframe technology.
Role (Client: ING Vysya, Netherlands):
  • Analysed client requirements and converted them into technical specifications
  • Analysed, designed and coded client developments using COBOL and DB2 on the mainframe platform
  • Monitored design creation and the data model
  • Managed functional design reviews and led technical design reviews
  • Developed design documents for the various components identified in the system
Environment: Mainframe COBOL, JCL, DB2, Dumpmaster, Expeditor, Endevor, QMF, VSAM, Flat File, CICS, REXX, Tracemaster

Education:

Biju Pattnaik Utkal University 04/2001 - 04/2005
Bhubaneswar, Odisha, India
Degree: Bachelor's Degree
Major: Computer Science
Bachelor's Degree in Computer Science

