
Upload: swapnil-soni

Post on 09-Aug-2015


Page 1: swapnilcv

Swapnil Soni
Email: [email protected]
Mobile No.: +1 937 610-6562
http://knoesis.wright.edu/researchers/swapnil

OBJECTIVE

Seeking to leverage my experience and skills in software engineering to develop big data analytics and scalable applications.

EDUCATION

Master of Science in Computer Science, Wright State University, 2013-Present
Bachelor of Engineering in Information Technology, Rajiv Gandhi University, 2005-2008
Polytechnic Diploma in Information Technology, Rajiv Gandhi University, 2002-2005

TECHNICAL SKILLS

• Languages: Expert knowledge of core Java and J2EE frameworks (Servlet, Struts, Spring, JSF, iBATIS, Hibernate, and JPA)

• Big data technologies: Expert knowledge of Hadoop/HDFS, MapReduce, Apache Storm, AQL (Annotation Query Language), and SystemT; beginner-level knowledge of the Stanford Parser and R

• Cloud: Intermediate-level knowledge of OpenStack, Amazon EC2, and Amazon S3

• NoSQL: Intermediate-level knowledge of MongoDB

• Machine learning: Some exposure to machine learning techniques

• Web services: JAX-WS web services, RESTful web services (REST)

• Web portals: webMethods Portal, ADF (Application Development Framework) Service

• Databases: Intermediate-level knowledge of Oracle and MySQL; beginner-level knowledge of Virtuoso

• Social media APIs : Twitter API, Facebook API

• Scripting: JavaScript, jQuery

• Others: HTML, JSON, XML, XSD, RDF

• Tools: JProfiler, Apache Tomcat, Apache HTTP Server, WebLogic 10.3, Eclipse, NetBeans, Hudson, Apache Maven

WORK EXPERIENCE

• Kno.e.sis Center, Dayton, OH (Jan 2013 - present) (Graduate Research Assistant)

◦ PREDOSE: Developed text analytics over unstructured web forum discussions using IBM BigInsights, SystemT, AQL, and Apache Hadoop.

◦ Twitris: Implemented an analysis pipeline engine for streaming data (i.e., tweets) using the Twitter Streaming API, Apache Storm, and MongoDB. Also designed and developed an application to extract the top 10 most popular URLs from tweets.

◦ Twitris-C: Worked on user engagement and social network analysis of social media content (i.e., tweets) using Java and MongoDB.

◦ Material Science: Designed and implemented a centralized web application for integrating various material science resources, using the J2EE framework.

• DERI, Galway, Ireland (Dec 2013 - May 2014) (Intern)


◦ Designed and developed a Finite Element (FE) simulation application (Sifem) using JavaServer Faces (JSF), which aims to improve the reproducibility and automation of FE simulations, with RDF (Resource Description Framework) and OWL for knowledge representation.

• Oracle Financial Services Software Ltd (OFSS), Mumbai (Apr 2012 - Dec 2012) (Staff Consultant)

◦ Worked on developing a Service-oriented architecture (SOA) for an Internet banking application usingthe Application Development Framework (ADF) and Java Web Services.

• Tata Consultancy Services (TCS), Mumbai (Sep 2008 - Apr 2012) (System Engineer)

◦ Worked as a Java, J2EE, Java Web Services, and webMethods Portal developer. Specifically, contributed to designing, developing, and testing various web and SOA-based applications built for insurance services.

◦ On-site in Sydney, Australia (Feb 2011 - June 2011): Worked directly with clients and offshore teams on developing a batch cancellation module between a bank and an insurance company.

PUBLICATION

• Andre Freitas, Kartik Asooja, Swapnil Soni, Marggie Jones, Panagiotis Hasapis, Ratnesh Sahay. Towards a Semantic Web Platform for Finite Element Simulations. 11th European Semantic Web Conference (ESWC), Springer LNCS, Crete, Greece, 2014.

MASTER’S THESIS

My thesis work is on developing a Question Answering (QA) system over real-time social media. The objective is to provide a social media analytics platform that delivers the latest information for health-related queries. The system supports both predefined and dynamic interpretation of natural-language health-related queries. Once a query is interpreted, extracting answers from a real-time and voluminous corpus is very challenging; for that, we used many data mining and information retrieval techniques to extract answers, and Hadoop and Apache Storm to handle the data.

REFERENCES

Dr. Amit P. Sheth
Director, Kno.e.sis Center, Wright State University
amit.sheth[at]wright.edu

Dr. T. K. Prasad
Professor, Computer Science, Wright State University
t.k.prasad[at]wright.edu