
Saravanan Elumalai
[email protected] | +1 (480) 321-7816
https://www.linkedin.com/in/saravanan-elumalai-463b3721

Resume

Personal Information

Name: Saravanan Elumalai
Years of Experience: 8.5+ years
LinkedIn: https://www.linkedin.com/in/saravanan-elumalai-463b3721
Hortonworks Digital Badge: http://bcert.me/ssslzwcf

Objective

I am a Big Data technology specialist and Hortonworks HDP Certified Developer experienced in designing, analyzing, developing, building, and deploying Hadoop applications and in creating data virtualization and visualization layers to meet enterprise business needs, performing high-performance tuning, and improving software functionality. My expertise lies in enabling business users to leverage customer, financial, and operational information to improve performance, productivity, and profitability. I am particularly adept with the Apache Hadoop ecosystem. I am also quick to find resolutions, perform root cause analysis of application and environment issues, and fix them on time.

As a quick learner with a passion for technology, I am able to rapidly assess business conditions and determine the most appropriate and cost-effective data solutions for particular situations. I work comfortably with multi-cultural groups and in diverse IT environments, including testing, production, and quality assurance, among others.

I am equally at home working as a team leader or as an individual contributor. In either case, I understand the importance of aligning an agile team with business goals, organizational objectives, and on-time deliverables. I consider myself a valued business partner and trusted advisor to executive leadership.

I always enjoy learning new applications and methodologies that can help my customers. Current areas of expertise include Apache Hadoop distribution cluster setup; Hive, Sqoop, Pig, HBase, Flume, Phoenix, Hue, HDFS, YARN, Tez, Oozie, ZooKeeper, and MapReduce; Windows and Unix operating systems; NoSQL (Riak, Couchbase) and MySQL; RabbitMQ; Node.js; continuous integration using Jenkins, GitHub, Nexus, and Rally; a data virtualization layer using Denodo; and a data visualization layer using SAP BO (Business Objects), all within agile software development. I am also interested in learning machine learning with scikit-learn in Python.


Employment History

Employer Name                                      Designation          Duration
Infosys Limited (USA)                              Technology Lead      Sep 2015 – Present
Cognizant Technology Solutions (India, US Corp)    Associate            Jan 2011 – Sep 2015
PS Technologies, Vellore, India                    Software Engineer    Jul 2008 – Dec 2010

Education Details

Title of the Degree with Branch: Master of Science in Software Engineering (5 Years Integrated)
College/University: Vellore Institute of Technology, Vellore, India
Year of Passing: 2008

Technical Skills

Hardware / Platforms: Linux, AIX, Solaris, Windows, BeagleBone, Raspberry Pi devices
Technology: Open source
Programming Languages: C/C++, Java, Python, Erlang
Server-Side Programming: Node.js, Tronoda (Python), Gearman Job Server
Hadoop Eco System: Apache Hive, Pig, Sqoop, HDFS, HBase, Flume, Phoenix, Hue, Oozie
Version Control: GitHub, Nexus, Jenkins (Continuous Integration)
Cloud Management: AWS services, Nginx
Tools Used: Agile Rally, Supervisord (Python), ZooKeeper (Distributed Configuration), Eclipse IDE
Messaging Queue: RabbitMQ, MQTT
Protocols: XMPP, AMQP, HTTP, SIP
Security Server: Central Authentication Service (CAS), SSO & OAuth 2.0
Databases: MySQL, NoSQL (Riak, Couchbase, Hadoop HBase)
Scripting Languages: JavaScript, shell scripting, Perl scripting
Data Virtualization & Visualization Layer: Denodo, SAP BO (Business Objects)

Awards and Recognition

Above & Beyond Awards: Feather on the Cap, Cognizant Global Technology Office (GTO), October 2011
Winner and Runner-Up of InsuranceNext Premier League (IPL), InsuranceNext (Cognizant), November 2013


Appreciation

Cognizant InsuranceNext Premier League 2013
Cognizant & Client Appreciation 2014–2015

Project Details

Organization: Infosys Limited

Client: T-Mobile USA, Inc.
Technology: Hadoop Eco System (Apache HBase, Hive, Pig, Sqoop), Java, Agile Rally, Nexus, Jenkins, Denodo (Virtualization) & SAP BO (Visualization)
Period: Sep 2015 – Present
Role: Technology Lead

Project Objective

T-Mobile's IDW Data Acquisition track project is part of the IDW (Integrated Data Warehouse) program at T-Mobile. The project follows an agile approach so that business stakeholders can continuously provide feedback to the scrum team, and it is broken into two-week sprints.

Project Description

The Data Acquisition track project is part of the Integrated Data Warehouse program. The project implements business logic and data transformation logic using Pig scripts. Multiple source systems hold transaction-related real-time and historical data, which is populated into a distributed Hadoop data lake across various Hive tables. The goal is to build an integrated data warehousing platform that generates consolidated reports per the business requirements.

Responsibilities

- Collaborate with vendors to gather all data mappings required for analysis, and write reference-table lookup logic to populate the target data on which the business builds its reports.
- Create Hive DDLs and write Apache Pig scripts with UDFs to perform transformation logic per the business requirements.
- Create Apache Oozie workflows to schedule jobs and store the results in HDFS locations.
- Create Apache Sqoop jobs to export data from HDFS, via Hive tables, to Teradata for consolidated report generation (see the sketch after this list).
- Create unit test cases to validate results against the logic, including failure scenarios, and generate complete test reports.
- Store the developed source code in GitHub, along with Maven-based artifact XML to support continuous integration, and libraries/binaries in Nexus.
- Prepare the continuous integration service using Jenkins for automated build and packaging to promote jobs into the QA/Prod environments.
- Create a data virtualization layer using Denodo to build views on top of Hadoop and Teradata views for the Business Objects (BO) reporting layer.
- Involved in fixing production job bugs and performance tuning to shorten job execution time.
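As an illustration of the Sqoop export step above, here is a minimal Python sketch that shells out to the sqoop CLI; the JDBC URL, table name, export directory, and credentials are hypothetical placeholders, not the project's actual values.

    #!/usr/bin/env python
    # Minimal sketch: export a Hive-backed HDFS directory to Teradata via Sqoop.
    # The JDBC URL, table, export dir, and credentials are hypothetical.
    import subprocess

    def sqoop_export(jdbc_url, table, export_dir, user, password):
        """Run 'sqoop export' as a subprocess; raise on a non-zero exit."""
        subprocess.check_call([
            "sqoop", "export",
            "--connect", jdbc_url,
            "--username", user,
            "--password", password,
            "--table", table,                         # target Teradata table
            "--export-dir", export_dir,               # HDFS dir backing the Hive table
            "--input-fields-terminated-by", "\\001",  # Hive's default delimiter
        ])

    if __name__ == "__main__":
        sqoop_export(
            "jdbc:teradata://teradata-host/DATABASE=reporting",  # hypothetical
            "CONSOLIDATED_REPORT",
            "/apps/hive/warehouse/reporting.db/consolidated_report",
            "etl_user", "etl_password",
        )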

Organization: Cognizant Technology Solutions

Client: CSAA Insurance Group, a AAA insurer
Technology: Perl, Derby DB, Windows Batch, IBM IC2M Tool
Period: Feb 2014 – Sep 2015
Role: Lead Developer / Application Developer III

Project Objective

Development activities for Claims Admin System (CAS) document migration (auto, homeowner, etc.).

Project Description

Claims documents are archived in the HOCS/HAL, AS400, and MAIG legacy systems. The project retrieves the physical claim documents from the legacy systems into the Fast Lane enterprise document management system using the IBM Content Migration Tool. During migration, claims document metadata is retrieved from the legacy owner and updated in the IBM Content Connections Module (IC2M) staging DB to drive document extraction. The extracted claims documents are stored in Fast Lane and their metadata in CAS.

Responsibilities

- Analyze business requirements and convert them into functional requirements for software development.
- Involved in software development activities such as design, coding, and testing for enhancement projects and defect fixes.
- Involved in writing scripts to generate queries for millions of documents and to process package checkpoint validation.
- Coordinate with the offshore Cognizant team in the design, coding, and testing phases of project execution.
- Attend meetings with the client manager and the Cognizant onsite manager to report status and risks during project execution.
- Track and resolve defects identified in the existing application software and the software under development.
- Involved in the timely delivery of all agreed project deliverables at the quality required by the organization's standards (ISO, SEI CMM Level 5) wherever applicable.
- Create and maintain project documentation per the project life cycle; all documents are version controlled and maintained for client review and audits.
- Automated scripts to verify that documents were transferred without damage using MD5 checksums (see the sketch after this list).
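The original automation here was written in Perl; as a language-neutral illustration of the checksum validation in the last item, here is a minimal Python sketch. The file paths are hypothetical placeholders.

    #!/usr/bin/env python
    # Minimal sketch: verify a migrated document against its source using MD5.
    import hashlib

    def md5_of(path, chunk_size=1 << 20):
        """Compute the MD5 digest of a file, reading it in 1 MiB chunks."""
        digest = hashlib.md5()
        with open(path, "rb") as handle:
            for chunk in iter(lambda: handle.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def verify_transfer(source_path, target_path):
        """True when the source and the migrated copy have identical digests."""
        return md5_of(source_path) == md5_of(target_path)

    if __name__ == "__main__":
        # Hypothetical paths standing in for a legacy document and its copy.
        ok = verify_transfer("legacy/claims/0001.tif", "fastlane/claims/0001.tif")
        print("transfer intact" if ok else "transfer damaged")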


Client: Global Technology Office (GTO) – IoT Innovation, CTS
Technology: Java, Node.js, Python, Hadoop Eco System (Apache HBase, Hive, Pig, Sqoop, Flume), NoSQL (Couchbase, Redis), RabbitMQ, Tronoda, Security Service (CAS), Gearman Job Server
Devices: BeagleBone, temperature sensors, Chronos accelerometer, RFID tags, etc.
Period: Jan 2012 – Feb 2014
Role: Lead Developer

Project Objective

Innovation project to host an Internet of Things (IoT) Mediator Platform as a Service.

Project Description

Cognizant Global Technology Office internal project to develop an innovative IoT mediation platform server through which any low-power device can communicate device-to-device over the internet, based on workflows defined by the device or person.

Responsibilities

- Analyze business requirements and convert them into functional requirements for software development.
- Coded embedded programs in Python to collect sensor data from the devices, convert it into a readable format, and push the data to the mediator platform.
- Created a web server using Node.js to handle as many requests as possible and validate request data against the corresponding schema stored in Couchbase; on success, the data is queued in RabbitMQ for background processing.
- Implemented a Gearman job server and clients to parallelize work such as serializing and deserializing data.
- Designed the storage of sensor, event, and predicted-event data in the Hadoop ecosystem.
- Developed CEP (Complex Event Processing) to predict events from sensor data using the designed rule engine.
- Developed a separate component to handle data storage, retrieval, and transformation logic using the Hadoop ecosystem, and began analyzing historical data for predictive analysis.
- Architected and implemented real-time data streaming, exposed as web APIs serving multiple concurrent requests, using the Redis pub/sub model and RabbitMQ (see the sketch after this list).
- Developed deployment automation scripts from scratch, using Python Fabric, to promote each required component to other environments, and maintained an Amazon EC2 cluster.
- Learned and implemented OAuth 2.0 and Single Sign-On (SSO) services in the mediation platform from scratch.
- Monitored team activity and coordinated with the team to explain concepts and lead discussions.
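A minimal Python sketch of the Redis pub/sub streaming pattern referenced above, using the redis-py client; the channel name and message shape are hypothetical, and the real platform exposed this through Node.js web APIs.

    #!/usr/bin/env python
    # Minimal sketch of Redis pub/sub streaming with the redis-py client.
    # The channel name and payload shape are hypothetical placeholders.
    import json
    import redis

    r = redis.Redis(host="localhost", port=6379)

    def publish_reading(sensor_id, value):
        """Publish one sensor reading to every live subscriber."""
        r.publish("sensor-stream", json.dumps({"sensor": sensor_id, "value": value}))

    def stream_readings():
        """Block on the channel and yield readings as they arrive."""
        pubsub = r.pubsub(ignore_subscribe_messages=True)
        pubsub.subscribe("sensor-stream")
        for message in pubsub.listen():
            yield json.loads(message["data"])

    if __name__ == "__main__":
        publish_reading("temp-01", 72.4)  # a subscriber in another process sees this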

Client: Global Technology Office (GTO), CTS
Technology: Erlang, XMPP Server/Client, NoSQL (Riak), Python
Period: May 2011 – Jan 2012
Role: Application Developer

Project Objective

Proof of concept.

Project Description

Cognizant Global Technology Office internal project to showcase an innovative, implementable concept, using the technologies above to achieve collaborative work on a social platform.

Role and Responsibilities

- Designed and implemented collaborative work on a social platform using the Erlang functional language, and established chat using the XMPP server ejabberd.
- Learned new technologies such as Erlang, Python, and ejabberd within the given time frame, and used that knowledge to develop the platform from scratch.
- Coordinated with external CoEs to implement new features: video chat and recording of event-capture activities.
- Achieved fault tolerance: if the application crashes, the platform still accepts requests and stores them in a file on disk; once the application is back up, it automatically restores its state and reads the new requests stored in the file system (see the sketch after this list).
- Demonstrated the proof of concept to the head of the department, which led to the formation of a new CoE, the Cognizant IoT Innovation CoE, to implement the concept at scale.
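The fault-tolerance logic above was implemented in Erlang; as a language-neutral illustration, here is a minimal Python sketch of the same journal-and-replay idea. The journal path and request shape are hypothetical placeholders.

    #!/usr/bin/env python
    # Minimal sketch of journal-and-replay fault tolerance: append every
    # request to a file before handling it, then replay the file on startup.
    import json
    import os

    JOURNAL = "requests.journal"  # hypothetical journal path

    def accept_request(request):
        """Persist the request first, so a crash cannot lose it."""
        with open(JOURNAL, "a") as journal:
            journal.write(json.dumps(request) + "\n")
            journal.flush()
            os.fsync(journal.fileno())

    def replay_on_startup(handler):
        """After a restart, re-run every journaled request through the handler."""
        if not os.path.exists(JOURNAL):
            return
        with open(JOURNAL) as journal:
            for line in journal:
                handler(json.loads(line))

    if __name__ == "__main__":
        accept_request({"user": "alice", "action": "join-room"})
        replay_on_startup(lambda req: print("replaying:", req))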

Client: PayPal
Technology: PayPal internal application and C++
Period: Jan 2011 – Apr 2011
Role: Application Developer

Project Objective

Project to investigate PayPal website flexibility.

Project Description

PayPal analysis project to investigate their website's flexibility for users, collect website access information, and then pass it, via a buffer, to a third-party survey used to improve the website.

Role and Responsibilities

- Requirement analysis.
- Coding (involving policy rule coding, sequence flow, status flow, and dictionary entries).
- Unit testing.

Organization: PS Technology, Vellore, Tamil Nadu, India.

Client: OXSEED Software GmbH, Bielefeld, Germany
Technology: C/C++, BoundsChecker, Microsoft SDK, Perl 5.8.7
Role: Team Member
Period: Jul 2008 – Dec 2010

Project Objective

Enterprise Content Management (ECM) document conversion from one format to another.

Project Description

The core work is to provide internal technical and client-driven customizations of the AFP2web product for OXSEED Software GmbH, Germany. This involves new feature development, bug fixes, product customization based on client needs, and product versioning. The AFP2web product converts various documents into web-consumable electronic document formats. It offers conversion support either as a batch program or as an SDK. AFP2web can be customized:

- to convert documents in batches;
- to convert documents on the fly, as required;
- to convert documents as a server, upon request.

AFP2web supports processing file formats such as AFP (IBM printer file format), Line Data (IBM line printer file format), PDF, TIFF, ASCII, JPEG, PNG, etc. It also archives converted documents to different destinations such as IBM MQ Server, Oracle database, EMC archiver, etc. AFP2web is presently developed in C++, Java, and Perl, and is portable across operating systems such as Windows, Linux, AIX, and Solaris.

Responsibilities

- Requirement analysis.
- Involved in the maintenance and enhancement phases.
- Involved in source code integration and finding memory leaks in the source code.
- Involved in building the AFP2web product as binaries and an SDK on various operating systems.
- Involved in analyzing third-party libraries such as libTIFF, libGD, libJPEG, XPDF, FreeType, etc.
- Involved in developing client-requested customization Perl modules.
- Unit testing.

Client: OXSEED Software GmbH, Bielefeld, Germany
Technology: Java, Servlet, Perl, ActiveMQ, DirWatcher
Role: Team Member
Period: Oct 2009 – Dec 2010

Project Objective

Enterprise Content Management (ECM) document conversion from one format to another as a REST service.

Project Description

The OCTP application converts Word documents and AFP spools to single-page TIFF outputs using ActiveMQ, DirWatcher, and ServiceMix components.

DirWatcher is a program that monitors file system changes in specified directories and sends a message with the relevant information to the queue. DirWatcher has an internal ActiveMQ instance that stores messages before sending them to the external ActiveMQ; a sketch of this pattern follows.
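DirWatcher itself is a Java program; as an illustration of the pattern only, here is a minimal Python sketch that polls a directory and enqueues a message per new file. The watched path is hypothetical, and an in-process queue.Queue stands in for the internal/external ActiveMQ pair.

    #!/usr/bin/env python
    # Minimal sketch of the DirWatcher pattern: poll a directory and enqueue
    # a message for every newly arrived file. queue.Queue stands in for
    # ActiveMQ, and the watched path is a hypothetical placeholder.
    import os
    import queue
    import time

    def watch_directory(path, out_queue, interval=2.0):
        """Poll `path` and put one message on `out_queue` per new file."""
        seen = set(os.listdir(path))
        while True:
            current = set(os.listdir(path))
            for name in sorted(current - seen):
                out_queue.put({"event": "file-added",
                               "path": os.path.join(path, name)})
            seen = current
            time.sleep(interval)

    if __name__ == "__main__":
        messages = queue.Queue()
        # In the real pipeline, ServiceMix consumes these messages from
        # ActiveMQ and triggers the Word/AFP to TIFF conversion.
        watch_directory("incoming/", messages)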

ActiveMQ is messaging provider software that sits between DirWatcher and ServiceMix. DirWatcher's internal ActiveMQ sends messages to the external ActiveMQ, which are in turn read by the ServiceMix components.

ServiceMix is an Enterprise Service Bus (ESB) built on the Java Business Integration (JBI) specification. For each message, it calls services to perform the required conversion, renaming the input directory at the different stages of the Word-to-TIFF or AFP-to-TIFF conversion.

The application is portable across operating systems such as Windows, Linux, and AIX. It can fetch a request from the client side, convert the spools on the server machine, and then return the converted output to the client machine.
