Upload: jayant-gautam

Post on 08-Apr-2015

Page 1: Understanding Technologies

Mainframes

A large powerful computer, often serving many connected terminals and usually used by large complex organizations.

In the "ancient" mid-1960s, all computers were mainframes, since the term referred to the main CPU cabinet. Today, it refers to a class of ultra-reliable medium and large-scale servers designed for enterprise-class and carrier-class operations.

The first mainframe vendors were Burroughs, Control Data, GE, Honeywell, IBM, NCR, RCA and Univac, otherwise known as "IBM and the Seven Dwarfs." After GE's and RCA's computer divisions were absorbed by Honeywell and Univac respectively, the mainframers were known as "IBM and the BUNCH." Mainframe applications are written in programming languages such as COBOL (typically running under CICS) and various 4GLs.

Expertise Areas

Mainframe- OS : OS/390, MVS, UNIX

Languages : JCL, COBOL, REXX, PL/1, C/C++, Assembler, Java

Databases : CICS, DB2, IDMS, IMS DB, VSAM

Utilities : Syncsort, Quikjob, Fileaid

Tools : REVOLVE, CA7 (Job Manager), XPEDITER, ACF2, CA-Examine

Mainframe-OS

Multiple Virtual Storage (MVS)

Introduced in 1974, MVS is the primary operating system used with IBM mainframes (the others are VM and DOS/VSE). It is a batch-processing-oriented operating system that manages large amounts of memory and disk space. Online operations are provided with CICS, TSO and other system software.

MVS/Extended Architecture

MVS/XA manages the enhancements introduced in 1981 with IBM's 370/XA architecture, including 2GB of virtual memory.

MVS/Enterprise Systems Architecture

MVS/ESA manages the enhancements made to large-scale mainframes, including 16TB of virtual memory, introduced in 1988 with IBM's ESA/370 architecture. MVS/ESA ran on all models of the System/390 ES/9000 product line introduced in 1990.

OS/390 In 1996, MVS/ESA was packaged with an extensive set of utilities, and the combined product was renamed OS/390. The name MVS was still used to refer to the base control program within OS/390.

Languages

COBOL

Common Business Oriented Language or COBOL is a high-level programming language that has been the primary business application language on mainframes and minis. It is a compiled language and was one of the first high-level languages developed.

CICS

Customer Information Control System or CICS is a TP monitor from IBM that was originally developed to provide transaction processing for IBM mainframes. It controls the interaction between applications and users and lets programmers develop screen displays without detailed knowledge of the terminals used. It provides terminal routing, password security, transaction logging for error recovery and activity journals for performance analysis.

CICS has also been made available on non-mainframe platforms including the RS/6000, AS/400 and OS/2-based PCs.

CICS commands are written along with and into the source code of the applications, typically COBOL, although assembly language, PL/I and RPG are also used. CICS implements SNA layers 4, 5 and 6.

PL/I

Programming Language One, or PL/I, is an imperative programming language designed for scientific, engineering, and business applications. It is one of the most feature-rich programming languages and was among the first of that kind. It has been used by academic, commercial and industrial users since it was introduced in the early 1960s, and it is still actively used today.

PL/I's principal domain is data processing; it supports recursion and structured programming. The language syntax is English-like and suited for describing complex data formats, with a wide set of functions available to verify and manipulate them.

REXX

Restructured eXtended eXecutor or REXX is an interpreted computer programming language which was developed at IBM. It is a structured high-level programming language which was designed to be both easy to learn and easy to read. Both commercial and open source interpreters for REXX are available on a wide range of computing platforms, and compilers are available for IBM mainframes.

Job Control Language (JCL)

JCL is a scripting language used on IBM mainframe operating systems to instruct the Job Entry Subsystem (that is, JES2 or JES3) on how to run a batch program or start a subsystem.
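Conceptually, a JCL job is an ordered list of program steps, each of which ends with a return (condition) code that can determine whether later steps run, roughly what the COND parameter expresses. A loose Python analogy of that control flow (the step names and codes are invented for illustration):

```python
# Loose Python analogy of a JCL job: ordered steps, each returning a
# condition code; execution stops once a step's code is too high
# (roughly what the JCL COND parameter expresses).

def run_job(steps, max_ok_code=4):
    """Run steps in order; stop after a step returns a code above max_ok_code."""
    results = {}
    for name, step in steps:
        code = step()
        results[name] = code
        if code > max_ok_code:
            break  # comparable to later steps being bypassed after a failure
    return results

# Hypothetical steps standing in for EXEC PGM=... statements.
def sort_step():    return 0   # clean completion
def report_step():  return 4   # warning-level completion
def cleanup_step(): return 0

job = [("SORT", sort_step), ("REPORT", report_step), ("CLEANUP", cleanup_step)]
print(run_job(job))  # all three steps run: codes 0, 4, 0
```

A real job would also allocate datasets with DD statements and invoke compiled programs; the sketch only mirrors the step-and-condition-code control flow.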

Databases

DB2

A relational DBMS from IBM that was originally developed for its mainframes. It is a full-featured SQL DBMS that has become IBM's major database product. Known for its industrial-strength reliability, DB2 has been made available by IBM for all of its own platforms, including OS/2, OS/400, AIX (RS/6000) and OS/390, as well as for Solaris on Sun systems and HP-UX on HP 9000 workstations and servers.

IDMS

Integrated Database Management System or IDMS is a (network) CODASYL database management system first developed at B.F. Goodrich and later marketed by Cullinane Database Systems (renamed Cullinet in 1983). Since 1989 the product has been owned by Computer Associates, who renamed it CA-IDMS.

Virtual Storage Access Method (VSAM)

VSAM is an IBM disk file storage scheme first used in the OS/VS2 operating system, later used throughout the Multiple Virtual Storage (MVS) architecture and now in z/OS.
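VSAM's key-sequenced data sets (KSDS) keep records in key order and allow both direct retrieval by key and sequential browsing from a starting key. A toy Python analogue of that access pattern (the record layout and keys here are invented):

```python
import bisect

class KeyedDataset:
    """Toy analogue of a VSAM KSDS: records kept in key order,
    readable directly by key or sequentially from a starting key."""
    def __init__(self):
        self.keys = []      # sorted record keys
        self.records = []   # records parallel to keys

    def write(self, key, record):
        i = bisect.bisect_left(self.keys, key)
        if i < len(self.keys) and self.keys[i] == key:
            self.records[i] = record      # replace an existing record
        else:
            self.keys.insert(i, key)      # insert, preserving key order
            self.records.insert(i, record)

    def read(self, key):                  # direct (random) read by key
        i = bisect.bisect_left(self.keys, key)
        if i < len(self.keys) and self.keys[i] == key:
            return self.records[i]
        return None

    def read_from(self, key):             # sequential browse from a key
        i = bisect.bisect_left(self.keys, key)
        return self.records[i:]

ds = KeyedDataset()
ds.write("C100", {"name": "ACME"})
ds.write("A050", {"name": "BETA"})
print(ds.read("A050"))        # direct read by key
print(ds.read_from("B000"))   # all records from key "B000" onward
```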

IMS

IBM Information Management System (IMS) is a joint hierarchical database and information management system with extensive transaction processing capability.


Utilities

Syncsort

Syncsort is a high-performance sort, merge and copy utility widely used to keep enterprise data organized and accessible.

Quikjob/Quikcode

Quikcode is a PC language similar to Quikjob on the mainframe.

Fileaid

File-AID provides a test data management workbench for mainframe and distributed environments, enabling you to quickly and easily browse and copy data.

Tools

REVOLVE

CA7 (Job Manager)

XPEDITER

ACF2

CA-Examine


Midrange Business Computers

The System i (formerly known as iSeries, AS/400, and Application System/400) is a computer platform produced by IBM. It was officially introduced in 1988 and renamed eServer iSeries in 2000 as part of IBM's e-Server branding initiative. With the global move of the server and storage brands to the System brand under the Systems Agenda, the family was renamed System i in 2006, with the POWER5-based members of the series called the System i5.

AS/400

(Application System/400) The earlier generation and original name of IBM's iSeries and i5 families of midrange business computers. Introduced in 1988, the AS/400 evolved into the iSeries in 2000 and the i5 in 2004. When first introduced, the AS/400 was considered a "minicomputer."

The vast majority of AS/400 commands were written by IBM developers to perform system level tasks like compiling programs, backing up data, changing system configurations, displaying system object details, or deleting them. Commands are not limited to systems level concerns and can be drafted for user applications as well.

CL/400

The AS/400 control language (CL) is a scripting language bearing a resemblance to IBM mainframe JCL and consisting of an ever-expanding set of command objects (*CMD) used to invoke traditional AS/400 programs and/or get help on what those programs do. CL can also be used to create CL programs (analogous to shell scripts), where additional commands provide program-like functionality (GOTO, IF/ELSE, variable declaration, file input, etc.)

OS/400

The AS/400 includes an extensive library-based operating system, OS/400, and is also capable of supporting multiple instances of AIX, Linux, Lotus Domino, Microsoft Windows 2000 and Windows Server 2003. While OS/400, AIX, Linux and Lotus Domino are supported on the POWER processors, Windows is supported with either single-processor internal blade servers (IXS) or externally-linked multiple-processor servers (IXA).


RPG/400

RPG/400 introduced a much cleaner syntax and tighter integration with the integrated database. The language became the mainstay of development on the AS/400, and its editor was a simple line editor with prompt templates for each specification (type of instruction).

DB2/400

The AS/400 version of DB2 from IBM


Enterprise Resource Planning (ERP) systems integrate (or attempt to integrate) all data and processes of an organization into a unified system. A typical ERP system will use multiple components of computer software and hardware to achieve the integration. A key ingredient of most ERP systems is the use of a unified database to store data for the various system modules
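The unified-database idea can be illustrated with a minimal sketch: two "modules" use the same database, so a record posted by one is immediately visible to the other with no interface file or batch transfer in between. The table and column names below are invented:

```python
import sqlite3

# One shared database serving multiple ERP "modules" (schema is illustrative).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sales_orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
)

# The sales module posts an order...
conn.execute(
    "INSERT INTO sales_orders (customer, amount) VALUES (?, ?)", ("ACME", 1200.0)
)
conn.commit()

# ...and the finance module reads the same record directly: the
# integration between modules IS the shared database.
total = conn.execute(
    "SELECT SUM(amount) FROM sales_orders WHERE customer = ?", ("ACME",)
).fetchone()[0]
print(total)  # 1200.0
```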

What are the different ERP products?

SAP, BAAN, JD Edwards, Oracle Financials, Siebel and PeopleSoft. Of these, most companies have implemented, or are trying to implement, SAP because of its advantages over the other ERP packages.

The major vendors are SAP, Oracle, PeopleSoft and JD Edwards, with typical solution areas including:

Financials

Human Resources

Customer Relationship Management

Supplier Relationship Management

Product Lifecycle Management

Supply Chain Management

Business Intelligence

Components of the FI (Financial Accounting) Module

The FI Module comprises several sub-modules as follows:

Accounts Receivable

Accounts Payable

Asset Accounting

Bank Accounting

Consolidation

Funds Management

General Ledger

Accounts Receivable records all account postings generated as a result of customer sales activity.

These postings are automatically updated in the General Ledger.

Accounts Payable records account postings generated as a result of Vendor purchasing activity. Automatic postings are generated in the General Ledger as well.
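The automatic General Ledger updates described above amount to double-entry posting: a customer invoice in Accounts Receivable generates balanced debit and credit entries in the ledger. A minimal sketch, with invented account names:

```python
# Minimal double-entry sketch: an AR invoice automatically produces
# balanced General Ledger postings (account names are illustrative).
general_ledger = []

def post_customer_invoice(customer, amount):
    """Record a sale by mirroring it into the GL as a debit and a credit."""
    general_ledger.append(
        {"account": "Accounts Receivable", "debit": amount, "credit": 0.0, "ref": customer}
    )
    general_ledger.append(
        {"account": "Sales Revenue", "debit": 0.0, "credit": amount, "ref": customer}
    )

post_customer_invoice("ACME", 500.0)

# Every posting keeps the ledger balanced: total debits equal total credits.
debits = sum(e["debit"] for e in general_ledger)
credits = sum(e["credit"] for e in general_ledger)
print(debits == credits)  # True
```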

Asset Accounting is utilized for managing your company’s Fixed Assets.

7

Page 8: Understanding Technologies

Bank Accounting allows for management of bank transactions in the system including cash management.

Consolidation enables the combining of financial statements for multiple entities within an organization. These statements provide an overview of the financial position of the company as a whole.

Funds Management allows management to set budgets for revenues and expenses within your company as well as track these to the area of responsibility.

General Ledger is fully integrated with the other SAP Modules. It is within the General Ledger that all accounting postings are recorded. These postings are displayed in real-time providing up-to-date visibility of the financial accounts.

CO Controlling – basically your internal cost/management accounting, including

Cost elements

Cost centres

Profit centres

Internal orders

Activity based costing

Product costing

CO (Controlling) Module provides supporting information to Management for the purpose of planning, reporting, as well as monitoring the operations of their business.

Some of the components of the CO (Controlling) Module are as follows:

Cost Element Accounting

Cost Center Accounting

Internal Orders

Activity-Based Costing (ABC)

Product Cost Controlling

Profitability Analysis

Profit Center Accounting

The Cost Element Accounting component provides information on the costs and revenues of an organization. These postings are automatically updated from FI (Financial Accounting) to CO (Controlling).

Cost Center Accounting  provides information on the costs incurred by your business.

8

Page 9: Understanding Technologies

Internal Orders provide a means of tracking costs of a specific job , service, or task. Internal Orders are used as a method to collect those costs and business transactions related to the task.

Activity-Based Costing allows a better definition of the source of costs to the process driving the cost.

Product Cost Controlling gives management the ability to analyze product costs and to decide on the optimal prices at which to market products. It is within this module of CO (Controlling) that planned, actual and target values are analyzed. Sub-components of the module are:

Product Cost Planning, which includes Material Costing (cost estimates with quantity structure, cost estimates without quantity structure, master data for mixed cost estimates, production lot cost estimates), Price Updates, and Reference and Simulation Costing.

Cost Object Controlling, which includes Product Cost by Period, Product Cost by Order, Product Costs by Sales Orders, Intangible Goods and Services, and CRM Service Processes.

Actual Costing/Material Ledger, which includes Periodic Material Valuation, Actual Costing, and Price Changes.

Profitability Analysis gives management the ability to review information on the company's profit or contribution margin by business segment, in two forms:

Account-Based Analysis

Cost-Based Analysis

Profit Center Accounting provides visibility of an organization’s profit and losses by profit center.


What is SAP?

SAP, started in 1972 by five former IBM employees in Mannheim, Germany, states that it is the world's largest inter-enterprise software company and the world's fourth-largest independent software supplier overall. SAP applications, built around its latest R/3 system, provide the capability to manage financial, asset, and cost accounting, production operations and materials, personnel, plants, and archived documents. The R/3 system runs on a number of platforms including Windows 2000 and uses the client/server model. The latest version of R/3 includes a comprehensive Internet-enabled package.

SAP has recently recast its product offerings under a comprehensive Web interface, called mySAP.com, and added new e-business applications, including customer relationship management (CRM) and supply chain management (SCM).

SAP is now moving away from the past term 'modules' to what it now prefers to call 'solutions'. These new 'solutions' are:

Financials

Human Resources

Customer Relationship Management

Supplier Relationship Management

Product Lifecycle Management

Supply Chain Management

Business Intelligence

The 'modules' (as they were formerly known) which make up the SAP framework are:

FI Financial Accounting – essentially your regulatory ‘books of record’, including

General ledger

Book close

Tax

Accounts receivable

Accounts payable

11

Page 12: Understanding Technologies

Consolidation

Special ledgers

CO Controlling – basically your internal cost/management accounting, including

Cost elements

Cost centers

Profit centers

Internal orders

Activity based costing

Product costing

AM Asset Management – track, value and depreciate your assets, including

Purchase

Sale

Depreciation

Tracking

PS Project Systems – manage the projects, large and small, including

Make to order

Plant shut downs (as a project)

Third party billing (on the back of a project)

HR Human Resources – people, including

Employment history

Payroll

Training

Career management

Succession planning

PM Plant Maintenance – maintaining the equipment (e.g. a machine, an oil rig, an aircraft etc),

including

Labour

12

Page 13: Understanding Technologies

Material

Down time and outages

MM Materials Management – underpins the supply chain, including

Requisitions

Purchase orders

Goods receipts

Accounts payable

Inventory management

BOMs

Master raw materials, finished goods etc

QM Quality Management – improve the quality of your goods, including

Planning

Execution

Inspections

Certificates

PP Production Planning – manages your production process, including

Capacity planning

Master production scheduling

Material requirements planning

Shop floor

SD Sales and Distribution – from order to delivery, including

RFQ

Sales orders

Pricing

Picking (and other warehouse processes)

Packing

13

Page 14: Understanding Technologies

Shipping

CA Cross Application – these lie on top of the individual modules, and include

WF – workflow

BW – business information warehouse

Office – for email

Workplace

Industry solutions

New Dimension products such as CRM, PLM, SRM, APO etc

R/3:

R/3 is the comprehensive set of integrated business applications from SAP, the German company that states it is the market and technology leader in business application software. R/3 replaced an earlier system, R/2, which is still in use. R/3 uses the client/server model and provides the ability to store, retrieve, analyze, and process corporate data in many ways for financial analysis, production operation, human resource management, and most other business processes.

SCM:

Supply chain management (SCM) is the oversight of materials, information, and finances as they move in a process from supplier to manufacturer to wholesaler to retailer to consumer. Supply chain management involves coordinating and integrating these flows both within and among companies.

CRM:

CRM (customer relationship management) is an information industry term for methodologies, software, and usually Internet capabilities that help an enterprise manage customer relationships in an organized way.

NetWeaver: NetWeaver is an application builder from SAP for integrating business processes and databases from a number of sources while exploiting the leading Web services technologies. NetWeaver has been tagged as a product that could help industry adoption of Web services.

ABAP: ABAP (Advanced Business Application Programming) is a programming language for developing applications for the SAP R/3 system, a widely installed business application subsystem. The latest version, ABAP Objects, is object-oriented.

ABAP Objects uses a single-inheritance model and fully supports object features such as encapsulation, polymorphism, and persistence.


ABAP Workbench: The ABAP Workbench is a set of programs for developing enterprise resource management (ERM) applications that run in the R/3 subsystem from SAP. The latest version includes ABAP Objects, an object-oriented programming language.

ALE: Application Link Enabling (ALE) is a mechanism for the exchange of business data between loosely-coupled R/3 applications built by customers of SAP, the enterprise resource management program.

IDOC: IDoc (for intermediate document) is a standard data structure for electronic data interchange (EDI) between application programs written for the popular SAP business system or between an SAP application and an external program. IDocs serve as the vehicle for data transfer in SAP's Application Link Enabling (ALE) system.
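An IDoc, in outline, is a control record describing the message plus a sequence of typed data segments, with status records tracking processing. A rough sketch of that shape (the segment and field names below are invented, not actual IDoc types):

```python
# Rough sketch of an IDoc-like structure: one control record describing
# the message, plus typed data segments (all names are invented).
idoc = {
    "control": {"idoc_type": "ORDERS_LIKE", "sender": "SYSTEM_A", "receiver": "SYSTEM_B"},
    "data": [
        {"segment": "HEADER", "fields": {"order_no": "4711", "date": "2025-01-01"}},
        {"segment": "ITEM",   "fields": {"material": "MAT001", "qty": 10}},
        {"segment": "ITEM",   "fields": {"material": "MAT002", "qty": 5}},
    ],
    "status": [],  # processing-step records would be appended here
}

def items(doc):
    """Extract the fields of all ITEM segments from the data records."""
    return [seg["fields"] for seg in doc["data"] if seg["segment"] == "ITEM"]

print(len(items(idoc)))  # 2
```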

BAPI: BAPI (Business Application Programming Interface) is a set of interfaces to object-oriented programming methods that enable a programmer to integrate third-party software into the proprietary R/3 product from SAP. For specific business tasks such as uploading transactional data, BAPIs are implemented and stored in the R/3 system as remote function call (RFC) modules.

Smart Forms/Smart Script:

SAP Smart Forms was introduced in SAP Basis Release 4.6C as the tool for creating and maintaining forms. SAP Smart Forms allows you to make simple modifications to a form and its form logic using simple graphical tools.

SAP Script: Often there are instances where output from an SAP program is required on paper in a pre-designed format. This is not possible using normal ABAP code. Instead, SAP provides an object called SAPscript to generate such documents, which can contain logos, tables and other objects and can look like pre-printed forms.

REPORTS:

Reports are developed, for example, to display material number, plant and storage data for a material number entered on the selection screen; to show a customer's payment history or the details of purchase orders; or as an interactive report that displays customer number, customer name and contact address in the basic list and then, in the detailed list, customer number, customer name, distribution channel, division, partner number, partner function and descriptions.

BDC:

Among these techniques, Batch Data Communication (BDC) is the oldest one. BDC is not bi-directional; in its typical form it is an integration tool that can only be used for uploading data into R/3. The purpose of Batch Data Communication is to transfer data: BDC can transfer data from one SAP system to another, or from a non-SAP system into an SAP system. To transfer data, BDC uses normal transaction codes, and two methods are provided for doing this work.
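Conceptually, a batch-input upload such as BDC turns each legacy record into the same field entries a user would type into the target transaction's screens. A rough sketch of that mapping (the field names are invented, not actual SAP screen fields):

```python
import csv, io

# Hypothetical legacy file: one material per line (fields are illustrative).
legacy_file = io.StringIO("MAT001,Widget,KG\nMAT002,Bolt,PC\n")

def build_batch_input(rows):
    """Map each legacy row to (screen_field, value) pairs, mimicking the
    keystrokes a user would enter in the target transaction."""
    sessions = []
    for material, description, unit in rows:
        sessions.append([
            ("MATERIAL_NUMBER", material),   # invented field names,
            ("DESCRIPTION", description),    # purely illustrative
            ("BASE_UNIT", unit),
        ])
    return sessions

batch = build_batch_input(csv.reader(legacy_file))
print(len(batch))    # 2 transactions queued
print(batch[0][0])   # ('MATERIAL_NUMBER', 'MAT001')
```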

Technical Consultant


Responsibilities

As a Technical Consultant, responsibilities include developing technical specifications according to the given requirements, coding, unit testing of code and delivering a high-quality product with flexibility for the user.

Material Code Generator

Developed a Dialog Program to generate Material Codes. Using this transaction, a user can generate material codes with reference to product types such as Finished, Stitched, Dyed, Grayed and Yarn products.

Uploaded the generated codes to the Material Master table using BDC program.

Developed an ALV report to list out the status of codes that are generated.

Lab Testing Details

Developed a module pool program to update the test details of materials that are procured or produced in-house.

Developed a print output for each test report using SMARTFORMS.

Developed an ALV Hierarchical Report to list out the test results

Reports

Developed a Hierarchical ALV report to track the Pending Purchase Orders and Production Orders against the booked customer order.

Developed an ALV report to list out the sales orders and invoices that were assigned to the Letter of Credit and Payment dues for the received LC.

Developed a Hierarchical ALV report detailing Customer and Vendor ageing on a GRN-wise basis, partial payments and invoice date.

SMARTFORMS

Developed a smartform to print the Shipment Summary List, used to ensure the correct vehicle, weight and deliveries when leaving the plant, and assigned it to the shipping output type.

Developed a smartform for invoice document sent to the customer against the goods that the client has sent through high sea sales.

Developed various outputs for Sales and Finance documents:

Export Sales invoices and packing List

Commercial Invoice

Weaving Invoices and Packing list

Debit and Credit Note

SAP Scripts


Modified various standard layouts in Finance transactions

In Transaction F110, modified the standard SAP script layout for the Supplier Remittance Advice, which is produced as part of the payment process and then sent to the suppliers.

In Transaction FBL1N, modified standard SAP script Layout for vendor account statement details to meet the company’s needs.

BADI

Developed a BADI for the Billing Transaction (VF02) for downloading data to third-party software – EXIM

User Exits

Developed User Exit for Pricing procedure to apply tool rental discounts

Developed User Exit for Sales Order (VA02) for downloading data to third party software – EXIM

BAPI

Developed an ALV report to download data from SAP into an Excel file and upload data using a BAPI for special materials, where all the specified MRP View fields can be changed

Roles and Responsibilities of a Consultant

A consultant can work in a functional, technical or techno-functional role.

Here are some generalized roles and responsibilities of a consultant.

Functional Consultant

Understanding the business process; studying and analyzing workflow to design solutions (As-Is and To-Be); analysis and document preparation.

Setting up the organizational structure for MM.

Configuration of Logistics General parameters for the Material Master and Vendor Master, which includes number range assignment, screen sequence, field configuration and creation of material groups.

Configuration of document types and number range assignment for PR, RFQ and PO.

Configured Info Records, Source Lists and Quota Arrangements.

Configuration of the Release Strategy with classification for PR and PO.

Configuration of the pricing procedure.

Worked on subcontracting and stock transport orders (inter and intra).

Made the required settings for issuing materials using a stock determination strategy.

Valuation of materials and account determination with automatic account assignment.

Physical Inventory: inventory count and posting of inventory differences.

Carried out legacy data uploading with the LSMW tool using the standard batch/direct input method.

Providing solutions to end-user issues.

Oracle E-Business Suite

Oracle E-Business Suite is the industry's only complete and integrated set of enterprise applications, working together seamlessly to streamline every area of your business—from sales, service, and marketing, through financials and human resources, to supply chain and manufacturing.

Oracle E-Business Suite – Industry Applications


Oracle E-Business Suite 11i.10 offers over 2,100 new capabilities, half of which meet specific industry needs, including:

Financial Services: SOP documentation and auditing for compliance with Sarbanes-Oxley and other regulations

Healthcare: Medication administration, patient encounter-specific financial information, integrated patient care and operational intelligence

Manufacturing/High Technology: Option-dependent sourcing, automated spare parts return and repair processing, international drop shipments, distribution planning

The E-Business Suite (current version: 12, released January 31, 2007) contains several product lines, including:

Oracle Financials

Oracle HRMS

Oracle Sales

Oracle Financials

Oracle Financials includes several dozen "modules", each of them separately licensed within the E-Business Suite. The modules include:

Assets - asset tracking and maintenance.

Cash Management - electronic bank statement loading and reconciliation features.

Daily Business Intelligence for Financials - up-to-date graphical views of corporate performance using some of Oracle's pre-built KPIs. DBIs serve to show up-to-date snapshots of different areas of a business, such as:

Accounts Payable - Supplier Ageing

Accounts Receivable - Customer Ageing

General Ledger - including a "Financial Statement Generator" to write custom reports which can span multiple companies.

Payments - payment facilities to pay in multiple currencies using multiple payment methods

Payables - ability to match electronically to purchase orders generated in Oracle Purchasing.

Receivables - raising sales-invoices, dunning letters and statements. In-built credit-control facilities to manage customer credit.

Oracle Ledger includes ERP features like inter-company transactions, consolidations, multiple-currency transactions and drill-down facilities.

Oracle SCM

Oracle SCM (for supply-chain management) can include:

Advanced Procurement

Demantra

Demand-driven planning

Logistics

Manufacturing

Order Management

Supply Chain Execution

Supply Chain Planning


Oracle HRMS

Oracle HRMS offers an integrated set of application software products specific to the area of human resource management systems (HRMS). It belongs to the Oracle E-Business Suite from Oracle Corporation.

Oracle HRMS includes several modules, specifically:

Core HR

Payroll

OTL (Oracle Time & Labour)

Oracle Learning Management (OLM)

Self-Service HR

Oracle Advanced Benefits

iRecruitment

Peoplesoft

PeopleSoft, Inc. was a company that provided Human resource management systems (HRMS), customer relationship management, Manufacturing, Financials, Enterprise Performance Management and Student Administration software solutions to large corporations, governments, and organizations.

2003 buyout of J.D. Edwards by PeopleSoft

In June 2003, the J.D. Edwards board agreed to an offer under which PeopleSoft would acquire J.D. Edwards; the takeover was completed in July. OneWorld was added to PeopleSoft's software line and was renamed EnterpriseOne.

2004 buyout of PeopleSoft by Oracle

In December 2004, Oracle completed the acquisition of PeopleSoft and has since continued to support products that were created by J.D. Edwards. The final purchase went through in January 2005. The PeopleSoft brand names for the J.D. Edwards offerings, EnterpriseOne and WorldSoftware, were retired. Today, the products are called, respectively, Oracle JD Edwards EnterpriseOne and Oracle JD Edwards World.


Data Warehouse:

A data warehouse is the main repository of an organization's historical data, its corporate memory. It contains the raw material for management's decision support system. The critical factor leading to the use of a data warehouse is that a data analyst can perform complex queries and analysis, such as data mining, on the information without slowing down the operational systems.

Its main features are:

Subject-oriented: The data in the database is organized so that all the data elements relating to the same real-world event or object are linked together.

Time-variant: Changes to the data in the database are tracked and recorded so that reports can be produced showing changes over time.

Non-volatile: Data in the database is never over-written or deleted; once committed, the data is static, read-only, and retained for future reporting.

Integrated: The database contains data from most or all of an organization's operational applications, and this data is made consistent.

Data warehouses arose because:

The processing load of reporting reduced the response time of the operational systems.

The database designs of operational systems were not optimized for information analysis and reporting.

Most organizations had more than one operational system, so company-wide reporting could not be supported from a single system.

Development of reports in operational systems often required writing specific computer programs, which was slow and expensive.

Offline Operational Databases: Data warehouses in this initial stage are developed by simply copying the database of an operational system to an off-line server, where the processing load of reporting does not impact the operational system's performance.

Offline Data Warehouse: Data warehouses in this stage of evolution are updated on a regular time cycle (usually daily, weekly or monthly) from the operational systems, and the data is stored in an integrated, reporting-oriented data structure.

Real-Time Data Warehouse: Data warehouses at this stage are updated on a transaction or event basis, every time an operational system performs a transaction (e.g. an order, a delivery or a booking).

Integrated Data Warehouse: Data warehouses at this stage are used to generate activity or transactions that are passed back into the operational systems for use in the daily activity of the organization.

OLTP databases are efficient because they typically deal only with the information around a single transaction. In reporting and analysis, thousands to billions of transactions may need to be reassembled, imposing a huge workload on the relational database. Given enough time the software can usually return the requested results, but because of the negative performance impact on the machine and all of its hosted applications, data warehousing professionals recommend that reporting databases be physically separated from the OLTP database.

Advantages

There are many advantages to using a data warehouse; some of them are:

Data warehouses enhance end-user access to a wide variety of data.
Decision support system users can obtain specified trend reports, e.g. the item with the most sales in a particular area within the last two years.
Data warehouses can be a significant enabler of commercial business applications, particularly customer relationship management (CRM) systems.

Concerns

Extracting, transforming and loading data consumes a lot of time and computational resources.
Data warehousing project scope must be actively managed to deliver a release of defined content and value.
There may be compatibility problems with systems already in place.
Security could develop into a serious issue, especially if the data warehouse is web accessible.
Data storage design controversy warrants careful consideration, and perhaps prototyping of the data warehouse solution for each project's environment.

What are ETL Tools?

ETL tools are meant to extract, transform and load data into a data warehouse for decision making. Before the evolution of ETL tools, this ETL process was done manually, using SQL code written by programmers. The task was tedious and cumbersome in many cases, since it involved many resources, complex coding and long work hours. On top of that, maintaining the code posed a great challenge to the programmers. There are now a number of ETL tools available in the market that perform the ETL process according to business and technical requirements; some of them follow.
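The extract-transform-load cycle described above can be sketched in a few lines of Python. This is a toy example with made-up order data, using in-memory SQLite as a stand-in warehouse; no particular ETL tool's API is implied.

```python
import csv
import io
import sqlite3

# Hypothetical source data, standing in for an operational system's export.
SOURCE_CSV = """order_id,region,amount
1,North,120.50
2,South,80.00
3,North,45.25
"""

def extract(text):
    """Extract: read raw rows from the CSV export."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast types and aggregate sales per region."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + float(row["amount"])
    return totals

def load(totals, conn):
    """Load: write the summarized data into a warehouse table."""
    conn.execute("CREATE TABLE IF NOT EXISTS region_sales (region TEXT, total REAL)")
    conn.executemany("INSERT INTO region_sales VALUES (?, ?)", totals.items())
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(SOURCE_CSV)), conn)
print(dict(conn.execute("SELECT region, total FROM region_sales")))
# → {'North': 165.75, 'South': 80.0}
```

Real ETL tools add scheduling, error handling, metadata management and connectors on top of this basic pattern.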

Popular ETL Tools

Tool Name                                    Company Name
Informatica                                  Informatica Corporation
DT/Studio                                    Embarcadero Technologies
DataStage                                    IBM
Ab Initio                                    Ab Initio Software Corporation
Data Junction                                Pervasive Software
Oracle Warehouse Builder                     Oracle Corporation
Microsoft SQL Server Integration Services    Microsoft
TransformOnDemand                            Solonde
Transformation Manager                       ETL Solutions

Informatica

Informatica is a powerful ETL tool from Informatica Corporation, a leading provider of enterprise data integration and ETL software.

The important Informatica components are: Power Exchange, Power Center, Power Center Connect, Power Channel, Metadata Exchange, Power Analyzer and Super Glue.

In Informatica, all the metadata information about source systems, target systems and transformations is stored in the Informatica repository. Informatica's Power Center Client and Repository Server access this repository to store and retrieve metadata.

Source and Target:

Consider a bank that has many branches throughout the world. In each branch, data may be stored in different source systems such as Oracle, SQL Server, Teradata, etc. When the bank decides to integrate its data from several sources for its management decisions, it may choose one or more systems such as Oracle, SQL Server or Teradata as its data warehouse target. Many organisations prefer Informatica for this ETL process, because Informatica is powerful in designing and building data warehouses. It can connect to several sources and targets, extract metadata from them, and transform and load the data into the target systems.

Guidelines to work with Informatica Power Center

Repository: This is where all the metadata information is stored in the Informatica suite. The Power Center Client and the Repository Server access this repository to retrieve, store and manage metadata.

Power Center Client: The Informatica client is used for managing users, identifying source and target system definitions, creating mappings and mapplets, creating sessions, running workflows, etc.

Repository Server: This repository server takes care of all the connections between the repository and the Power Center Client.

Power Center Server: The Power Center Server performs the extraction from sources and the loading of data into targets.

Designer: Source Analyzer, Mapping Designer and Warehouse Designer are tools that reside within the Designer wizard. Source Analyzer is used for extracting metadata from source systems. Mapping Designer is used to create mappings between sources and targets; a mapping is a pictorial representation of the flow of data from source to target. Warehouse Designer is used for extracting metadata from target systems; alternatively, metadata can be created in the Designer itself.

Data Cleansing: Power Center's data cleansing technology improves data quality by validating, correctly naming and standardizing address data. A person's address may not be the same in all source systems because of typos, or the postal code and city name may not match the address. These errors can be corrected by the data cleansing process, and the standardized data can then be loaded into the target systems (the data warehouse).
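A rough illustration of the standardization idea follows; the rules below are invented for the example, whereas real cleansing tools validate against postal reference data.

```python
import re

# Hypothetical standardization rules; real tools use postal reference data.
ABBREVIATIONS = {"st": "street", "rd": "road", "ave": "avenue"}

def cleanse_address(raw):
    """Normalize punctuation, case and whitespace, and expand common abbreviations."""
    words = re.sub(r"[.,]", " ", raw).lower().split()
    return " ".join(ABBREVIATIONS.get(w, w) for w in words)

# The same customer spelled differently in two source systems
a = cleanse_address("12 Main St.")
b = cleanse_address("12  main street")
assert a == b == "12 main street"
```

After cleansing, records that previously looked different can be matched and loaded consistently into the warehouse.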

Transformation: Transformations transform the source data according to the requirements of the target system. Sorting, filtering, aggregation and joining are some examples of transformations. Transformations ensure the quality of the data being loaded into the target, and this is done during the mapping process from source to target.
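A minimal sketch of such transformations, using hypothetical order and customer data, with a join, a filter and an aggregation in plain Python:

```python
# Hypothetical source rows and a lookup table, as in a joiner transformation.
orders = [
    {"cust_id": 1, "amount": 100},
    {"cust_id": 2, "amount": 250},
    {"cust_id": 1, "amount": 50},
]
customers = {1: "Acme", 2: "Globex"}   # lookup: id -> name

# Join + filter + aggregate, mirroring Joiner/Filter/Aggregator transformations.
joined = [{"name": customers[o["cust_id"]], "amount": o["amount"]} for o in orders]
big = [r for r in joined if r["amount"] >= 100]          # filter small orders out
totals = {}
for r in big:                                            # aggregate per customer
    totals[r["name"]] = totals.get(r["name"], 0) + r["amount"]
print(totals)   # {'Acme': 100, 'Globex': 250}
```

In a tool such as Informatica, each of these steps would be a configured transformation in a mapping rather than hand-written code.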

Workflow Manager: A workflow loads the data from source to target in a defined sequence. For example, if the fact tables are loaded before the lookup tables, the target system will raise an error, since the fact table would violate foreign key validation. To avoid this, workflows can be created to ensure the correct flow of data from source to target.
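The load-ordering problem is essentially a dependency sort. A sketch using Python's standard graphlib (table names are hypothetical):

```python
from graphlib import TopologicalSorter

# Hypothetical load dependencies: a fact table depends on its lookup/dimension
# tables, so a workflow must load those first to satisfy foreign keys.
deps = {
    "fact_sales": {"dim_product", "dim_customer"},
    "dim_product": set(),
    "dim_customer": set(),
}
order = list(TopologicalSorter(deps).static_order())
# The dimension tables always come before the fact table that references them.
assert order.index("fact_sales") > order.index("dim_product")
assert order.index("fact_sales") > order.index("dim_customer")
```

Workflow managers apply the same principle, with sessions as the nodes in the dependency graph.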

Workflow Monitor: This monitor is helpful in monitoring and tracking the workflows created in each Power Center Server.

Power Center Connect: This component helps to extract data and metadata from ERP systems like IBM's MQSeries, Peoplesoft, SAP, Siebel etc. and other third party applications.


Role & Responsibilities:
Designed and developed various kinds of mappings, sessions and workflows.
Designed technical design documents based on an understanding of the high-level documents.
Enhanced existing Informatica mappings.
Developed Unix shell scripts to run the Informatica workflows.
Used different tasks in workflows, such as Command, Session, Email and Decision.
Identified performance bottlenecks in ETL using different techniques and resolved performance issues.
Designed unit test cases and used them during unit testing.
Involved in TAT and UAT support.

Informatica

Power Exchange: Informatica Power Exchange, as a standalone service or along with Power Center, helps organizations leverage data by avoiding manual coding of data extraction programs. Power Exchange supports batch, real-time and changed data capture options on the mainframe (DB2, VSAM, IMS, etc.), on midrange systems (AS/400 DB2, etc.), for relational databases (Oracle, SQL Server, DB2, etc.) and for flat files on Unix, Linux and Windows systems.

Power Channel: This helps to transfer large amounts of encrypted and compressed data over LANs and WANs, through firewalls, and to transfer files over FTP, etc.

Metadata Exchange: Metadata Exchange enables organizations to take advantage of the time and effort already invested in defining data structures within their IT environment when used with Power Center. For example, an organization may be using data modeling tools such as Erwin, Embarcadero, Oracle Designer or Sybase PowerDesigner for developing data models. The functional and technical teams will have spent much time and effort in creating the data model's data structures (tables, columns, data types, procedures, functions, triggers, etc.). By using Metadata Exchange, these data structures can be imported into Power Center to identify source and target mappings, which saves time and effort; there is no need for an Informatica developer to create these data structures once again.

Power Analyzer: Power Analyzer provides organizations with reporting facilities. PowerAnalyzer makes accessing, analyzing, and sharing enterprise data simple and easily available to decision makers, enabling them to gain insight into business processes and develop business intelligence. With PowerAnalyzer, an organization can extract, filter, format, and analyze corporate information from data stored in a data warehouse, data mart, operational data store, or other data storage models. PowerAnalyzer works best with a dimensional data warehouse in a relational database, but it can also run reports on data in any table in a relational database that does not conform to the dimensional model.

Super Glue: Super Glue is used for loading metadata from several sources into a centralized place. Reports can be run against Super Glue to analyze metadata.

Power Mart: Power Mart is a departmental version of Informatica for building, deploying, and managing data warehouses and data marts. Power Center is used for a corporate enterprise data warehouse, while Power Mart is used for departmental data warehouses such as data marts. Power Center supports global and networked repositories and can be connected to several sources; Power Mart supports a single repository and can be connected to fewer sources. Power Mart can nevertheless grow to an enterprise implementation, and its codeless environment eases developer productivity.

Role & Responsibility:
Involved in analyzing the systems and gathering requirements.
Designed various mappings for extracting data from various sources, involving flat files and relational tables.
Used the Designer to create source definitions, design targets, create mappings and develop transformations.
Created different transformations for loading the data into the Oracle database, e.g. Source Qualifier, Joiner, Update Strategy, Lookup, Filter, Rank, Expression, Aggregator and Sequence Generator transformations.
Created sessions using the Informatica Power Center Server.

OLAP:

OLAP, an acronym for Online Analytical Processing, is an approach that helps organizations take advantage of their data. Popular OLAP tools are Cognos, Business Objects, MicroStrategy, etc. OLAP cubes provide insight into data and help the top executives of an organization to take decisions in an efficient manner.

Technically, an OLAP cube allows one to analyze data across multiple dimensions by providing a multidimensional view of aggregated, grouped data. With OLAP reports, major categories such as fiscal periods, sales regions, products, employees and promotions related to a product can be analyzed efficiently, effectively and responsively. OLAP applications include sales and customer analysis, budgeting, marketing analysis, production analysis, profitability analysis, forecasting, etc.


Short for Online Analytical Processing, OLAP is a category of software tools that provides analysis of data stored in a database. OLAP tools enable users to analyze different dimensions of multidimensional data; for example, they provide time-series and trend-analysis views. OLAP is often used in data mining. The chief component of OLAP is the OLAP server, which sits between a client and a database management system (DBMS). The OLAP server understands how data is organized in the database and has special functions for analyzing the data. OLAP servers are available for nearly all the major database systems.
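The grouped, multidimensional aggregation behind an OLAP cube can be mimicked in a few lines of Python. The sales facts below are invented; a real OLAP server precomputes and indexes such roll-ups rather than scanning rows each time.

```python
from collections import defaultdict

# Hypothetical fact rows: (region, product, quarter) dimensions plus a measure.
facts = [
    ("North", "Widget", "Q1", 100),
    ("North", "Gadget", "Q1", 40),
    ("South", "Widget", "Q2", 60),
]

def rollup(facts, dims):
    """Aggregate the measure over the chosen dimensions (0=region, 1=product, 2=quarter)."""
    totals = defaultdict(int)
    for row in facts:
        key = tuple(row[d] for d in dims)
        totals[key] += row[3]
    return dict(totals)

print(rollup(facts, (0,)))      # by region: {('North',): 140, ('South',): 60}
print(rollup(facts, (0, 2)))    # region x quarter slice of the cube
```

Each choice of dimensions is one "view" of the cube; slicing and dicing in an OLAP tool corresponds to changing the grouping key.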

What is COGNOS?

COGNOS is a powerful application with two facets: PowerPlay and Impromptu. They allow you, the user, to bring together information from the corporate database, Concept, JDE and other sources for analysis and reporting purposes.

This information is made available by the COGNOS Administrator.

How do you access COGNOS?

COGNOS is accessed via your web browser and will be set up as an option in your "Favourites" folder. There are many advantages to having web-based COGNOS; for example, the user is not limited to his or her own computer, and COGNOS can be accessed from home and overseas.

What is Cognos ReportNet?

Cognos ReportNet is a Web-based business intelligence suite of reporting applications used to create and run reports. It is accessible through a Web browser and does not require any additional software or hardware changes by your Information Technology (IT) staff. Cognos ReportNet easily exports to multiple formats such as Excel, PDF, XML, HTML, and CSV. WSU will be using the following three applications from Cognos ReportNet:

Roles And Responsibilities:
Understanding the business requirements and designing the application to meet them.
Involved in designing the report requirement templates.
Developed high-level design documents from the report requirement templates.
Involved in developing drill-through reports.
Designed reports with various cascading prompt features.
Designed reports with various types of filters, such as detail, summary and group filters.
Designed reports with parameterized filters.
Designed a report with conditional formatting.
Created jobs with sequential report scheduling.
Developed reports with various formatting features, eliminating duplicates.

COGNOS IMPROMPTU

Impromptu allows the system administrator to design specific reports for users. Impromptu reports, when accessed, retrieve up-to-the-minute information, so the reports are always current. (PowerPlay, by contrast, allows users to view reports based on a cube, which displays information current as of the time the cube was last updated, displayed with the cube name.) Report authors use Impromptu to create business-context reports, and can author virtually any report using Impromptu's frame-based reporting interface. Report data can come from any source, and reports can be deployed to Impromptu users across LANs and WANs, as well as to mobile users.

Roles And Responsibility
Understanding the existing business model and customer requirements.
Created the catalogs, tables, joins and folders for the users, and managed and tested catalogs with Cognos Impromptu.
Created standard filters, calculations, prompts and conditions in the catalog.
Created reports using different catalogs as per requirements.
Used Cognos Transformer to build multidimensional cubes and used them in Cognos PowerPlay to build reports.
Created models based on the dimensions, levels and measures required for the analysis.

COGNOS POWERPLAY

Cognos PowerPlay is a business performance measurement (BPM) analysis and reporting solution for online analytical processing (OLAP) applications. PowerPlay enables users to explore and analyze data from any angle, and in any combination, allowing for the quick identification of factors that are not readily found using other methods of analysis. These results can then be published to the Cognos Upfront EBI portal, where people who need this information can access it. Cognos PowerPlay enables anyone, beyond traditional report authors and business analysts, to perform their own multidimensional analysis and create reports on OLAP data in a Web, Windows, or Excel environment. With a single mouse click, users can convert these results from PDF format to a dynamic PowerPlay Web report that allows them to explore and analyze the underlying OLAP data and then share their findings with others.

Cognos PowerPlay draws information from relational databases to model and build PowerCubes ("Cubes"). Cubes are optimized data sets that enable users to perform analysis with quick response times. They can be small or large, containing more than a billion rows of data and 2 million categories. With such rich and fast-response data sources, you can analyze multiple aspects of your business: how a value changes over time, how it compares across geography, and how it changes against any other business dimension. PowerPlay reports and Cubes can be accessed by Web clients, or from Windows and Excel clients, all using the same application server.

DATASTAGE

IBM WebSphere DataStage is an ETL tool, part of the IBM WebSphere Information Integration suite and the IBM Information Server. It uses a graphical notation to construct data integration solutions and is available in several editions, such as the Server Edition and the Enterprise Edition. It was formerly known as Ardent DataStage and then Ascential DataStage; in 2005 it was acquired by IBM and added to the WebSphere family. The October 2006 release of DataStage integrated it into the new IBM Information Server platform.

Responsibilities:
Involved in developing jobs using Sequential File, Hashed File, Aggregator, Pivot, QualityStage, Row Merger, Row Splitter, Link Partitioner, Link Collector and IPC stages.
Involved in developing sequences using different activities such as Job Activity, User Variable, Sequencer, Start Loop, Execute Command and Notification activities with customized triggers.


Roles and Responsibilities

Involved in maintaining versions of the code using version control.
Involved in unit testing, integration testing and user acceptance testing.
Involved in optimization of the jobs.
Used different routines and functions and called stored procedures.
Involved in applying logic such as changed data capture and slowly changing dimensions.
Developed rules in QualityStage and used them in QualityStage jobs for the implementation of standardization and de-duplication.
Integration of DataStage and QualityStage.
Training the team members in QualityStage.
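As an illustration of the slowly-changing-dimension logic mentioned above, here is a minimal Type 2 sketch in Python. The field names are invented, and it assumes the common expire-and-insert approach: instead of overwriting a changed attribute, the old row is closed and a new current row is added, preserving history.

```python
from datetime import date

# One current dimension row for a hypothetical customer.
dim = [{"cust_id": 1, "city": "Pune", "valid_from": date(2020, 1, 1),
        "valid_to": None, "current": True}]

def apply_change(dim, cust_id, new_city, as_of):
    """Type 2 SCD: expire the old version and insert a new current one."""
    for row in dim:
        if row["cust_id"] == cust_id and row["current"] and row["city"] != new_city:
            row["valid_to"], row["current"] = as_of, False   # expire old version
            dim.append({"cust_id": cust_id, "city": new_city,
                        "valid_from": as_of, "valid_to": None, "current": True})
            break

apply_change(dim, 1, "Mumbai", date(2021, 6, 1))
assert [r["city"] for r in dim if r["current"]] == ["Mumbai"]
assert len(dim) == 2   # the old Pune row is kept for historical reporting
```

Changed data capture supplies the "what changed" events; the SCD logic decides how those changes are recorded in the dimension.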

BUSINESS OBJECTS

Business objects are objects in an object-oriented computer program that abstract the entities in the domain that the program is written to represent. For example, an order entry program needs to work with concepts such as orders, line items and invoices; each of these may be represented by a business object.

Business objects are sometimes called domain objects (where the word domain means the business), and a domain model represents the set of domain objects and the relationships between them. A business object often encapsulates all of the data and business behavior associated with the entity that it represents.

Business objects don't necessarily need to represent objects in an actual business, though they often do. They can represent any object related to the domain for which a developer is creating business logic. The term is used to distinguish the objects a developer is creating or using for the domain from all the other types of object he or she may be working with, such as user interface widgets and database objects such as tables or rows.

Roles and Responsibilities:
Involved in designing, creating, maintaining and distributing the universes.
Installed and tested the Business Objects setup and the setup of the different users.
Understood the existing database and was involved in modifying the database structure of Monster House.
Documented the existing BRIO 6.6.4 reports.
Created analysis sheets for existing reports.
Created a core universe with common classes and objects and linked it with other universes for ease of maintenance.
Report creation.
Created a document to set standards for universe development and report development.

AB INITIO

Roles And Responsibilities
Reviewed the current system configuration for all applicable systems.
Determined where, and whether, any changes needed to be made.
Documented the system configuration recommendations.
Developed Ab Initio graphs and the High Level Design document for data movement.
Developed Ab Initio graphs and the Detail Design document for data movement.
Developed a migration strategy to move graphs from the mainframe to the UNIX environment.
Analyzed all the existing JCLs that are used for invoking various graphs; wrote JCL scripts and prepared equivalent UNIX shell scripts.
Identified the invocation method of the scripts in the new infrastructure and developed new scripts according to the scheduling software used.
Updated all the Data Manipulation Language (DML) definitions within each graph where applicable.
Converted the EBCDIC data into ASCII format.
Migrated all the existing lookup files from the old infrastructure to the new one.
Transformed changes to individual graphs depending on the DMLs.
Updated all the references from MVS to Unix.
Understood the existing copybooks and converted them to Ab Initio DMLs.
Updated all the parameters to reference the new target infrastructure.
Understood the existing log files.
Created alerts and responses based on line of business or data feeds.
Created data backup and recovery graphs.
Involved in Unix testing: the run counts should be the same; manually compared counts; ran the graphs with a few records and compared the results on NFC and NITC; ran the graphs with full volume and did a file compare.


1. Operating Systems
   a) Windows
   b) Unix (AIX, Linux, HP-UX, Solaris)
   c) RTOS
   d) Bulnex
   e) Symbian
   f) Macintosh

2. Networking
   a) Networking Protocols
   b) Networking Administration
   c) Networking Security

3. Databases
   a) Oracle – PL/SQL
   b) SQL Server
   c) MS Access
   d) MySQL
   e) Sybase
   f) DB2
   g) Teradata
   h) Postgres
   i) SAP BASIS
   j) DB Architecture
   k) DB Administration
   l) DB Development


1. Operating Systems: An operating system (OS) is the computer program that manages all other programs on the machine. The most commonly used contemporary desktop OS is Microsoft Windows, with Mac OS X also being well known. Linux and the BSDs are popular Unix-like systems.

Roles & Responsibility:

Process Management - Process management is an operating system's way of dealing with running multiple processes. On the most fundamental of computers (those containing one processor with one core), multitasking is done by simply switching between processes quickly.
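The quick-switching idea can be illustrated with a toy round-robin scheduler. This is a simplification: real schedulers also weigh priorities, I/O waits and much more.

```python
from collections import deque

# Toy round-robin scheduler: the OS gives each runnable process a fixed time
# slice (quantum), then moves to the next, creating the illusion of simultaneity.
def round_robin(burst_times, quantum):
    """burst_times: {pid: remaining work units}; returns the execution order."""
    queue = deque(burst_times.items())
    order = []
    while queue:
        pid, remaining = queue.popleft()
        order.append(pid)                    # this process gets the CPU now
        remaining -= quantum
        if remaining > 0:
            queue.append((pid, remaining))   # not finished: back of the queue
    return order

print(round_robin({"A": 3, "B": 1, "C": 2}, quantum=1))
# ['A', 'B', 'C', 'A', 'C', 'A']
```

Each process makes progress in turn, which is why a single-core machine appears to run several programs at once.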

Memory Management - An operating system's memory manager coordinates the use of the various types of memory by tracking which is available, which is to be allocated or deallocated, and how to move data between them.

Disk and file systems - All operating systems include support for a variety of file systems.

Networking - Most current operating systems are capable of using the networking protocols. This means that computers running dissimilar operating systems can participate in a common network for sharing resources such as computing, files, printers, and scanners using either wired or wireless connections.

Security - Internal security can be thought of as protecting the computer's resources from the programs concurrently running on the system. External security concerns the services an operating system offers to other network computers and users; these services are usually provided through ports, or numbered access points, beyond the operating system's network address. Services include offerings such as file sharing, print services, email, web sites, and file transfer protocols (FTP), most of which can have compromised security.

Graphical User Interface - Most modern operating systems contain graphical user interfaces (GUIs). GUIs evolve over time; for example, Windows has modified its user interface almost every time a new major version of Windows is released, and the Mac OS GUI changed dramatically with the introduction of Mac OS X in 2001.


Device Drivers - A device driver is a specific type of computer software developed to allow interaction with hardware devices. Typically this constitutes an interface for communicating with the device, through the specific computer bus or communications subsystem that the hardware is connected to, providing commands to and/or receiving data from the device, and on the other end, the requisite interfaces to the operating system and software applications.

Types of Operating System:

a) Windows: Microsoft Windows is the name of several families of operating systems from Microsoft.

   i. SDK
   ii. Windows Threads
   iii. Windows (all others)

b) Unix: A computer operating system developed by Bell Laboratories, mostly used for servers and web servers.

   i. Unix Commands
   ii. Unix Threads
   iii. Unix IPC
   iv. Unix Socket Programming
   v. Unix System Calls

c) Linux: Linux is a free open-source operating system based on Unix.

   i. Linux Commands
   ii. Linux Threads
   iii. Linux IPC
   iv. Linux Socket Programming
   v. Linux System Calls

d) Solaris: A UNIX-based operating system designed by Sun Microsystems. The modern version of the Sun operating system is Solaris 2.x, which corresponds to SunOS 5.x. Solaris is an AT&T System V type of UNIX, and this version of UNIX is much different from SunOS 4.x.

   i. Solaris Commands
   ii. Solaris Threads
   iii. Solaris IPC
   iv. Solaris Socket Programming
   v. Solaris System Calls


e) AIX: AIX is an open operating system from IBM that is based on a version of Unix.

f) HP- UX: HP-UX is the UNIX-based operating system for the HP 9000 series of business servers from Hewlett-Packard.

g) RTOS (Real-Time Operating System):
   i. VxWorks
   ii. Windows CE

h) Symbian: Symbian is a mobile operating system (OS) targeted at mobile phones that offers a high level of integration with communication and personal information management (PIM) functionality.

i) Bulnex
j) Macintosh

2. Networking: A network comprises two or more PCs connected so that they can communicate and share resources. The terms network administrator, network specialist and network analyst designate job positions of engineers involved in computer networks, the people who carry out network administration.

Network administrators are basically the network equivalent of system administrators: they maintain the hardware and software that comprise the network.

Roles & Responsibility:

Coordinates voice network services, and technical and maintenance support, with outside service providers.
Knowledge of methods for detecting and preventing intrusion and other security breaches.
Knowledge of current computer, component, and network security systems and techniques, in order to prepare for such attacks and recommend new techniques and methods of defense.
Maintains records and prepares periodic and special reports of work performed.
Creates and maintains logical, physical and protocol maps, backups of all equipment configuration parameters, and network documentation standards and procedures.
Server maintenance and troubleshooting, including software security patches, operating system updates, performance tuning, log analysis, application updates and bug fixes.
Develops communication network plans and strategies, including standards.


Conducts customer needs assessments; analyzes costs; develops project plans for communications projects.
Assists in short- and long-term planning, including the development of strategic plans to leverage emerging technology to support future enterprise needs.
Meets with vendors to hold product demonstrations and to resolve communication network issues.
Provides training to technical support staff in various voice and/or data communication network functions.
Ensures prescribed service-quality objectives are met.
Knowledge of network and communication hardware and software.
Knowledge of operating systems and applications, and specific methods of network communication.
Identifies and resolves complex voice and/or data communications network problems; arranges for vendor support if necessary.
Understanding of IP telephony and Voice over IP solutions with respect to network design characteristics and related configuration of hardware and end-user devices.
Understanding and integration of Metropolitan Area Networks (MANs).
Understanding of the use of point-to-point radio communications to extend Ethernet network infrastructure to outlying enterprise properties.
Understanding of video and surveillance technology.
Plans, defines, designs, develops, coordinates, and implements voice and/or data communication systems.
Knowledge of troubleshooting and problem solving: must be able to quickly and correctly diagnose a problem, know how best to fix it, and communicate the solution effectively to either co-workers or end-users.
Understands the behavior of software, hardware and systems in relation to their use of communications technology, in order to deploy solutions and to troubleshoot problems.
Understanding of radio frequency regulations and spectrums.
Strong understanding of radio communications, including packet transfer over radio frequencies.
Maintains current knowledge of technology and of voice and data communications systems.
Knowledge of the purposes and methods of communication used by employees, servers and systems.

Types of Networking:

a) Networking Protocols: A networking protocol is software that provides a communication gateway (link) allowing the exchange of data between various networking systems.


   i. SNMP
   ii. TCP/IP
   iii. UDP
   iv. SS7
   v. All other protocols

b) Networking Administration
c) Networking Security
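As a small concrete example of protocol-based data exchange, here is a UDP message sent and received over localhost, using Python's standard socket module:

```python
import socket

# Minimal UDP exchange on localhost, illustrating how a protocol (here UDP over
# IP) lets two endpoints exchange data.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))            # port 0 = let the OS pick a free port
addr = server.getsockname()              # (host, port) the client will target

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.sendto(b"ping", addr)             # fire a datagram at the server

data, peer = server.recvfrom(1024)       # receive it on the other side
print(data)                              # b'ping'
server.close()
client.close()
```

TCP would add connection setup, ordering and retransmission on top of this basic send/receive pattern; SNMP and other application protocols define the meaning of the bytes exchanged.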

3. Databases: A database is a systematic organization of information that is readable by computer. A computer database is a structured collection of records or data stored in a computer system, so that a computer program or a person using a query language can consult it to answer queries. The records retrieved in answer to queries are information that can be used to make decisions.

The computer program used to manage and query a database is known as a database management system. The properties and design of database systems are included in the study of information science.
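The query-and-answer idea in miniature, using Python's built-in SQLite module (the sample data is invented):

```python
import sqlite3

# A database answers questions posed in a query language; SQLite ships with
# Python, so this sketch is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, dept TEXT)")
conn.executemany("INSERT INTO employees VALUES (?, ?)",
                 [("Asha", "IT"), ("Ravi", "HR"), ("Mei", "IT")])

# A query-language question: who works in IT?
rows = conn.execute(
    "SELECT name FROM employees WHERE dept = 'IT' ORDER BY name").fetchall()
print(rows)          # [('Asha',), ('Mei',)]
```

The DBMS handles storage, indexing and concurrency behind this simple interface, which is what the rest of this section's DBA responsibilities revolve around.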

Roles & Responsibilities:
- Recoverability - creating and testing backups
- Integrity - verifying or helping to verify data integrity
- Security - defining and/or implementing access controls to the data
- Availability - ensuring maximum uptime
- Performance - ensuring maximum performance given budgetary constraints
- Development and testing support - helping programmers and engineers to efficiently utilize the database
- Installation of new software - it is primarily the job of the DBA to install new versions of DBMS software, application software, and other software related to DBMS administration
- Configuration of hardware and software with the system administrator - the DBA must work closely with the system administrator to perform software installations, and to configure hardware and software so that it functions optimally with the DBMS
- Security administration - one of the main duties of the DBA is to monitor and administer DBMS security. This involves adding and removing users, administering quotas, auditing, and checking for security problems
- Data analysis - analyzing the data stored in the database and making recommendations relating to the performance and efficiency of that data storage. This might relate to more effective use of indexes, enabling "Parallel Query" execution, or other DBMS-specific features
- Database design (preliminary) - the DBA knows the DBMS and system, can point out potential problems, and can help the development team with special performance considerations
- Data modeling and optimization - by modeling the data, it is possible to optimize the system layout to take the most advantage of the I/O subsystem


Types of Database:

a) Oracle: An Oracle database consists of a collection of data managed by an Oracle database management system.

i. Oracle General
ii. OCI
iii. Architecture
iv. Oracle Security
v. Database Management
vi. Forms & Reports

Oracle RDBMS release timeline

1978: Oracle version 1
1979: Oracle version 2
1982: Oracle version 3
1984: Oracle version 4
1986: Oracle version 5
1989: Oracle version 6
1993: Oracle version 7
1997: Oracle version 8
1999: Oracle version 8i
2001: Oracle version 9i
2003: Oracle version 10g
2007: Oracle version 11g

b) SQL Server: Microsoft SQL Server is a relational DBMS provided by Microsoft. It is sometimes mistakenly referred to simply as "SQL", which is actually the name of the query language.

c) MS Access: Microsoft's desktop database creation and maintenance software. It is popular for small-scale and departmental database applications.

d) MySQL: MySQL is a multithreaded, multi-user SQL database management system (DBMS). The basic program runs as a server providing multi-user access to a number of databases.

e) Sybase: Sybase Inc. is a software company that produces products and services related to information management, enterprise mobility, mobile messaging, data warehousing and analytics, and development tools. Sybase also refers to the relational database management system provided by Sybase Inc.

f) DB2 : DB2 is a family of relational database management system (RDBMS) products from IBM that serves a number of different operating system platforms.


g) Teradata: Teradata Corporation (NYSE: TDC) is a hardware and software vendor specializing in data warehousing and analytic technologies. Teradata was formerly a division of NCR Corporation, the largest company in Dayton, Ohio.

h) PostgreSQL: PostgreSQL is an object-relational database management system. It is released under a BSD-style license and is thus free software. As with many other open-source programs, PostgreSQL is not controlled by any single company; it relies on a global community of developers and companies for its development.

i) SAP Basis: A set of middleware programs and tools that provide the underlying base enabling applications to be interoperable across operating systems. SAP Basis includes an RDBMS, GUI, and client-server architecture. Basis is a business application software integrated solution; simply put, Basis is the administration of the SAP system.

j) DB Architecture
k) DB Administration
l) DB Development


Testing: Testing is a process used to help identify the correctness, completeness and quality of developed computer software. With that in mind, testing can never completely establish the correctness of computer software.

Software Testing Life Cycle:

The test development life cycle contains the following components:
- Requirements
- Use Case Document
- Test Plan
- Test Case
- Test Case Execution
- Report Analysis
- Bug Analysis
- Bug Reporting

Levels of Testing

Unit Testing: tests the minimal software component, or module. Each unit (basic component) of the software is tested to verify that the detailed design for the unit has been correctly implemented. In an object-oriented environment, this is usually at the class level, and the minimal unit tests include the constructors and destructors.

Integration Testing: exposes defects in the interfaces and interaction between integrated components (modules). Progressively larger groups of tested software components, corresponding to elements of the architectural design, are integrated and tested until the software works as a system.
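A minimal sketch of the idea: two invented components (an in-memory store and a service that depends on it) are wired together so that the check exercises the interface between them rather than either unit alone. All names here are made up for the example.

```java
import java.util.HashMap;
import java.util.Map;

public class IntegrationDemo {
    // Component 1: a trivial in-memory store.
    static class UserStore {
        private final Map<String, String> users = new HashMap<>();
        void save(String id, String name) { users.put(id, name); }
        String find(String id) { return users.get(id); }
    }

    // Component 2: a service that depends on the store.
    static class GreetingService {
        private final UserStore store;
        GreetingService(UserStore store) { this.store = store; }
        String greet(String id) {
            String name = store.find(id);
            return name == null ? "hello, stranger" : "hello, " + name;
        }
    }

    // Integration check: both components wired together, so a defect in the
    // interface between them (e.g. a wrong key) would surface here.
    public static String integrationCheck() {
        UserStore store = new UserStore();
        store.save("u1", "Asha");
        GreetingService service = new GreetingService(store);
        return service.greet("u1");
    }

    public static void main(String[] args) {
        System.out.println(integrationCheck());
    }
}
```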

Functional testing tests at any level (class, module, interface, or system) for proper functionality as defined in the specification.

System testing tests a completely integrated system to verify that it meets its requirements. System integration testing verifies that a system is integrated to any external or third party systems defined in the system requirements.

Acceptance Testing: can be conducted by the end-user, customer, or client to validate whether or not to accept the product. Acceptance testing may be performed as part of the hand-off process between any two phases of development (see also: software release life cycle).

Alpha Testing: simulated or actual operational testing by potential users/customers or an independent test team at the developers' site. Alpha testing is often employed for off-the-shelf software as a form of internal acceptance testing, before the software goes to beta testing.

Beta testing comes after alpha testing. Versions of the software, known as beta versions, are released to a limited audience outside of the company. The software is released to groups of people so that further testing can ensure the product has few faults or bugs. Sometimes, beta versions are made available to the open public to increase the feedback field to a maximal number of future users.

Software Testing Types:

Static Testing: The verification activities fall into the category of static testing. During static testing, you use a checklist to check whether the work you are doing follows the set standards of the organization. These standards can be for coding, integration and deployment. Reviews, inspections and walkthroughs are static testing methodologies.


Dynamic Testing: Dynamic testing involves working with the software, giving input values and checking whether the output is as expected. These are the validation activities. Unit tests, integration tests, system tests and acceptance tests are a few of the dynamic testing methodologies.

Black Box Testing: Black box testing attempts to derive sets of inputs that will fully exercise all the functional requirements of a system. It is not an alternative to white box testing. This type of testing attempts to find errors in the following categories:
1. incorrect or missing functions
2. interface errors
3. errors in data structures or external database access
4. performance errors
5. initialization and termination errors

Tests are designed to answer the following questions:
1. How is the function's validity tested?
2. What classes of input will make good test cases?
3. Is the system particularly sensitive to certain input values?
4. How are the boundaries of a data class isolated?
5. What data rates and data volume can the system tolerate?
6. What effect will specific combinations of data have on system operation?
Common black box techniques include Equivalence Partitioning, Boundary Value Analysis, and Cause-Effect Graphing.
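As an illustration of Boundary Value Analysis, the sketch below checks values at and just around the boundaries of a valid input range. The range (1 to 100), the class, and the method are invented for the example, not taken from the document.

```java
public class BoundaryValueDemo {
    // System under test: accepts quantities from 1 to 100 inclusive
    // (a made-up range, used only to illustrate the technique).
    public static boolean isValidQuantity(int qty) {
        return qty >= 1 && qty <= 100;
    }

    // Boundary value analysis selects inputs just below, on, and just
    // above each boundary, where off-by-one defects tend to hide.
    public static void main(String[] args) {
        int[] candidates = {0, 1, 2, 99, 100, 101};
        for (int qty : candidates) {
            System.out.println(qty + " -> " + isValidQuantity(qty));
        }
    }
}
```

The boundary cases (0, 1, 100, 101) are the ones most likely to expose a mistaken `>` vs `>=` comparison.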

White Box Testing: White box testing is a test case design method that uses the control structure of the procedural design to derive test cases. Topics include the nature of software defects, basis path testing (flow graphs, the basis set, deriving test cases, automating basis set derivation) and loop testing (simple loops, nested loops, concatenated loops, and unstructured loops).

Unit Testing: In computer programming, a unit test is a method of testing the correctness of a particular module of source code. The idea is to write test cases for every non-trivial function or method in the module, so that each test case is separate from the others where possible. This type of testing is mostly done by the developers.
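A minimal sketch of a unit test in plain Java. The method, values, and checks are invented for the example; in practice a framework such as JUnit would normally supply the assertion and reporting machinery.

```java
public class PriceCalculator {
    // Unit under test: a single, non-trivial method.
    public static double applyDiscount(double price, double percent) {
        if (percent < 0 || percent > 100) {
            throw new IllegalArgumentException("percent must be 0-100");
        }
        return price - price * percent / 100.0;
    }

    // Each check is independent of the others, mirroring the idea that
    // test cases should be separate where possible.
    public static void main(String[] args) {
        check(applyDiscount(200.0, 10.0) == 180.0, "10% off 200 should be 180");
        check(applyDiscount(50.0, 0.0) == 50.0, "0% discount leaves price unchanged");
        boolean threw = false;
        try { applyDiscount(10.0, 150.0); }
        catch (IllegalArgumentException e) { threw = true; }
        check(threw, "invalid percent should be rejected");
        System.out.println("all unit tests passed");
    }

    private static void check(boolean ok, String label) {
        if (!ok) throw new AssertionError(label);
    }
}
```

Note that the error path (an out-of-range percent) is tested as deliberately as the normal path.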

Requirements Testing:
Usage:
- To ensure that the system performs correctly
- To ensure that correctness can be sustained for a considerable period of time
The system can be tested for correctness through all phases of the SDLC, but in the case of reliability the programs should be in place to make the system operational.


Objective:
- Successful implementation of user requirements
- Correctness maintained over a considerable period of time
- Processing of the application complies with the organization's policies and procedures
- Secondary users' needs are fulfilled: security officer, DBA, internal auditors, record retention, comptroller

Regression Testing:
Objective: To verify that functionality which worked previously still works correctly after changes such as bug fixes or enhancements, typically by re-executing earlier test cases against the modified application.

Error Handling Testing:
Usage:
- Determines the ability of the application system to process incorrect transactions properly
- Errors encompass all unexpected conditions
- In some systems, approximately 50% of the programming effort is devoted to handling error conditions
Objective:
- Determine that the application system recognizes all expected error conditions
- Determine that accountability for processing errors has been assigned and that procedures provide a high probability that errors will be properly corrected
- Determine that reasonable control is maintained over errors during the correction process

Manual Support Testing:
Usage: Involves testing all the functions performed by people while preparing data for, and using data from, the automated system.
Objective:
- Verify that manual support documents and procedures are correct
- Determine that manual support responsibility is correct
- Determine that manual support people are adequately trained
- Determine that manual support and the automated segment are properly interfaced

Intersystem Testing:
Usage: To ensure that interconnections between applications function correctly.
Objective:
- Determine that proper parameters and data are correctly passed between the applications
- Ensure that documentation for the involved systems is correct and accurate
- Ensure that proper timing and coordination of functions exists between the application systems

Control Testing:
Usage: Control is a management tool to ensure that processing is performed in accordance with management's desires or intent.
Objective:
- Accurate and complete data
- Authorized transactions
- Maintenance of an adequate audit trail of information
- Efficient, effective and economical processing
- Processing that meets the needs of the user

Parallel Testing:
Usage: To ensure that the processing of a new application (or new version) is consistent with the processing of the previous application version.
Objective:
- Conduct redundant processing to ensure that the new version or application performs correctly
- Demonstrate consistency or inconsistency between the two versions of the application

Volume Testing: Whichever title you choose (for us, volume test), here we are talking about realistically exercising an application in order to measure the service delivered to users at different levels of usage. We are particularly interested in its behavior when the maximum number of users are concurrently active and when the database contains the greatest data volume. The creation of a volume test environment requires considerable effort. If the tests are to reliably reflect the to-be production environment, the correct level of complexity must exist in the data within the database and in the range of transactions and data used by the scripted users. Once the test environment is built it must be fully utilised, because volume tests offer much more than simple service delivery measurement. The exercise should seek to answer the following questions: What service level can be guaranteed? How can it be specified and monitored?

Stress Testing: The purpose of stress testing is to find defects in the system's capacity to handle large numbers of transactions during peak periods. For example, a script might require users to log in and proceed with their daily activities while, at the same time, a series of workstations emulating a large number of other systems run recorded scripts that add, update, or delete from the database.

Performance Testing:

System performance is generally assessed in terms of response time and throughput rates under differing processing and configuration conditions. To attack performance problems, several questions should be asked first:
- How much application logic should be remotely executed?
- How much updating should be done to the database server over the network from the client workstation?
- How much data should be sent in each transaction?
According to Hamilton [10], performance problems are most often the result of the client or server being configured inappropriately.
The best strategy for improving client-server performance is a three-step process [11]. First, execute controlled performance tests that collect data about volume, stress, and loading. Second, analyze the collected data. Third, examine and tune the database queries and, if necessary, provide temporary data storage on the client while the application is executing.


Testing Tools:

WinRunner: WinRunner is Mercury Interactive's enterprise functional testing tool, used to quickly create and run sophisticated automated tests on your application. WinRunner helps you automate the testing process, from test development to execution. You create adaptable and reusable test scripts that challenge the functionality of your application. Prior to a software release, you can run these tests in a single overnight run, enabling you to detect defects and ensure superior software quality.

LoadRunner: LoadRunner is divided into three smaller applications:
- The Virtual User Generator allows us to determine what actions we would like our Vusers, or virtual users, to perform within the application. We create scripts that generate a series of actions, such as logging on, navigating through the application, and exiting the program.
- The Controller takes the scripts that we have made and runs them through a schedule that we set up. We tell the Controller how many Vusers to activate, when to activate them, and how to group the Vusers and keep track of them.
- The Results and Analysis program gives us all the results of the load test in various forms. It allows us to see summaries of data, as well as the details of the load test for pinpointing problems or bottlenecks.

TestDirector: TestDirector, the industry's first global test management solution, helps organizations deploy high-quality applications more quickly and effectively. Its four modules (Requirements, Test Plan, Test Lab, and Defects) are seamlessly integrated, allowing for a smooth information flow between the various testing stages. The completely Web-enabled TestDirector supports high levels of communication and collaboration among distributed testing teams, driving a more effective, efficient global application-testing process.

SilkTest: SilkTest is a tool specifically designed for regression and functionality testing. It is developed by Segue Software Inc. SilkTest is the industry's leading functional testing product for e-business applications, whether Windows-based, Web, Java, or traditional client/server-based. SilkTest also offers test planning, management, direct database access and validation, the flexible and robust 4Test scripting language, a built-in recovery system for unattended testing, and the ability to test across multiple platforms, browsers and technologies. You have two ways to create automated tests using SilkTest:

Use the Record Testcase command to record actions and verification steps as you navigate through the application.

Write the testcase manually using the Visual 4Test scripting language.

QTP (QuickTest Professional): Mercury QuickTest Professional provides the industry's best solution for functional test and regression test automation, addressing every major software application and environment. QuickTest Professional (QTP) is a software test tool for functional and regression test automation. The software provides a keyword-driven approach, where test automation experts have full access to the underlying test and object properties via an integrated scripting and debugging environment.


Roles & Responsibilities:
- Preparing test cases using Functional Requirement Specifications and Use Case Documents
- Reviewing test cases
- Executing test cases manually to verify system functionality as per user requirements
- Involved in preparation of the test plan
- Performing sanity and system testing
- Performing retesting and regression testing
- Conducting functional testing and defect tracking
- Reporting and logging bugs using the Quality Center and TestDirector tools, or Excel sheets
- Coordinating with developers for defect resolution using Quality Center or TestDirector
- Preparing test scripts for QuickTest Professional and for the automation framework, and executing automation test scripts
- Preparing test reports and providing updates to management on testing process status
- Analyzing business requirements and functional documents
- Maintaining the existing system
- Workflow design for the EAI, fixing bugs and verifying that data is in sync


Core Java: Java refers to a number of computer software products and specifications from Sun Microsystems that together provide a system for developing application software and deploying it in a cross-platform environment. The components of Core Java include:
- JVM
- OOPs concepts
- Exception handling
- Inheritance
- Factory method

JVM: A virtual machine is like a computer running within a computer. Although slightly slower than running pure machine code, this offers greater portability as well as robustness and reliability. Java was the first mainstream language to use a virtual machine, known as the JVM or Java Virtual Machine. Java compilers generate bytecode, which runs on the JVM.

OOPs Concepts: Object-oriented programming (OOP) is an approach to programming, parallel to procedure-oriented programming, that became mainstream in the late 1980s. It models programs on real-world objects, helping developers build robust, user-friendly and efficient software, and it provides an efficient way to maintain real-world software.

Exception Handling: Exception handling is a programming language construct or computer hardware mechanism designed to handle the occurrence of some condition that changes the normal flow of execution. The condition is called an exception. Exceptions are normally recommended to be used only for signaling error (exceptional) conditions. For signaling conditions that are part of the normal flow of execution, see the concepts of signal and event handler.
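A small Java sketch of the construct described above. The parsing scenario and all names are invented for the example: a failed parse raises an exception that redirects the normal flow into the catch block.

```java
public class ExceptionDemo {
    // Parsing can fail; the exception diverts the normal flow of
    // execution into the catch block instead of crashing the program.
    public static int parseOrDefault(String text, int fallback) {
        try {
            return Integer.parseInt(text);      // may throw NumberFormatException
        } catch (NumberFormatException e) {
            return fallback;                    // recover from the exceptional condition
        } finally {
            // runs whether or not an exception occurred (cleanup goes here)
        }
    }

    public static void main(String[] args) {
        System.out.println(parseOrDefault("42", -1));     // normal flow
        System.out.println(parseOrDefault("oops", -1));   // exceptional flow
    }
}
```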

Inheritance: Generally speaking, objects are defined in terms of classes. You know a lot about an object by knowing its class. Even if you don't know what a penny-farthing is, if I told you it was a bicycle, you would know that it had two wheels, handlebars, and pedals.

Object-oriented systems take this a step further and allow classes to be defined in terms of other classes. For example, mountain bikes, racing bikes, and tandems are all different kinds of bicycles. In object-oriented terminology, mountain bikes, racing bikes, and tandems are all subclasses of the bicycle class. Similarly, the bicycle class is the superclass of mountain bikes, racing bikes, and tandems.
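The bicycle analogy can be sketched directly in Java. The class names follow the text; the `describe` method and its wording are invented for the example.

```java
public class InheritanceDemo {
    // Superclass: everything common to all bicycles.
    static class Bicycle {
        final int wheels = 2;
        String describe() { return "a bicycle with " + wheels + " wheels"; }
    }

    // Subclasses inherit the fields and methods of Bicycle,
    // and may override behavior where they differ.
    static class MountainBike extends Bicycle {
        @Override
        String describe() { return "a mountain bike (still " + wheels + " wheels)"; }
    }

    static class Tandem extends Bicycle { }   // inherits describe() unchanged

    public static void main(String[] args) {
        Bicycle[] fleet = { new Bicycle(), new MountainBike(), new Tandem() };
        for (Bicycle b : fleet) {
            System.out.println(b.describe()); // dynamic dispatch picks the right method
        }
    }
}
```

Because a `MountainBike` *is a* `Bicycle`, it can appear anywhere a `Bicycle` is expected, while still supplying its own behavior.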

Factory Method: The factory method pattern is an object-oriented design pattern. Like other creational patterns, it deals with the problem of creating objects (products) without specifying the exact class of object that will be created. The factory method design pattern handles this problem by defining a separate method for creating the objects, which subclasses can then override to specify the derived type of product that will be created. More generally, the term factory method is often used to refer to any method whose main purpose is the creation of objects.
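A minimal Java sketch of the pattern. The `Document`, `Report`, and `Invoice` names are invented for the example: the abstract creator works against the product interface, and each subclass overrides the factory method to decide which concrete product to instantiate.

```java
public class FactoryMethodDemo {
    interface Document { String type(); }

    static class Report implements Document { public String type() { return "report"; } }
    static class Invoice implements Document { public String type() { return "invoice"; } }

    // The creator declares the factory method; it never names
    // a concrete product class itself.
    static abstract class DocumentCreator {
        abstract Document createDocument();   // the factory method

        String open() {                       // logic written against the abstraction
            return "opened " + createDocument().type();
        }
    }

    // Subclasses override the factory method to choose the product.
    static class ReportCreator extends DocumentCreator {
        Document createDocument() { return new Report(); }
    }

    static class InvoiceCreator extends DocumentCreator {
        Document createDocument() { return new Invoice(); }
    }

    public static void main(String[] args) {
        System.out.println(new ReportCreator().open());
        System.out.println(new InvoiceCreator().open());
    }
}
```

Note that `open()` in the creator never changes; adding a new product type only requires a new creator subclass.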


J2EE: Short for Java 2 Platform, Enterprise Edition. J2EE is a platform-independent, Java-centric environment from Sun for developing, building and deploying Web-based enterprise applications online. The J2EE platform consists of a set of services, APIs, and protocols that provide the functionality for developing multitiered, Web-based applications. Some of the key features and services of J2EE:
- At the client tier, J2EE supports pure HTML, as well as Java applets or applications. It relies on JavaServer Pages and servlet code to create HTML or other formatted data for the client.
- Enterprise JavaBeans (EJBs) provide another layer where the platform's logic is stored. An EJB server provides functions such as threading, concurrency, security and memory management. These services are transparent to the author.
- Java Database Connectivity (JDBC), the Java equivalent to ODBC, is the standard interface for Java databases.
- The Java servlet API enhances consistency for developers without requiring a graphical user interface.

JSP: JavaServer Pages, or JSP for short, is Sun's solution for developing dynamic web sites. JSP provides excellent server-side scripting support for creating database-driven web applications.

JSF:-> JavaServer Faces technology simplifies building user interfaces for JavaServer applications. Developers of various skill levels can quickly build web applications by: assembling reusable UI components in a page; connecting these components to an application data source; and wiring client-generated events to server-side event handlers.  

JDBC: JDBC is a Java application programming interface that allows Java programmers to access database management systems from Java code. It was developed by JavaSoft, a subsidiary of Sun Microsystems.

ODBC: ODBC (Open Database Connectivity) is a function library which provides a common API (application programming interface) for ODBC-compliant database management systems. ODBC was developed by the SQL Access Group in 1992.

ODBC operates as an industry-standard "shim" between applications which utilize databases and the databases themselves. If an application is developed using ODBC, the application will be able to store data in any database management system which is ODBC-compliant.

Framework:

Struts: The Struts framework is an implementation of the Model-View-Controller (MVC) design pattern for JSP. Struts is maintained as a part of the Apache Jakarta project and is open source. The Struts framework is suited to applications of any size.

Hibernate: Hibernate is a free, open source Java package that makes it easy to work with relational databases. Hibernate makes it seem as if your database contains plain Java objects like you use every day, without having to worry about how to get them out of (or back into) mysterious database tables. It liberates you to focus on the objects and features of your application, without having to worry about how to store them or find them later.


Spring: Spring is a lightweight container, with wrappers that make it easy to use many different services and frameworks. Lightweight containers accept any JavaBean, instead of specific types of components. "Spring is more than just a 'lightweight container,'" says Justin Gehtland. "It allows Java developers who are building J2EE apps to get to the heart of their real domain problems and stop spending so much time on the minutiae of providing services to their domain." Gehtland and Bruce Tate are coauthors of Spring: A Developer's Notebook, a no-nonsense book that will get you up to speed quickly on the Spring open source framework, with examples and practical applications in ten chapters of code-intensive labs.

Commonly used Java development tools and platforms include:

Eclipse: Eclipse is a platform that has been designed from the ground up for building integrated web and application development tooling. By design, the platform does not provide a great deal of end-user functionality by itself. The value of the platform is what it encourages: rapid development of integrated features based on a plug-in model. Eclipse provides a common user interface (UI) model for working with tools. It is designed to run on multiple operating systems while providing robust integration with each underlying OS. Plug-ins can program to the Eclipse portable APIs and run unchanged on any of the supported operating systems.

At the core of Eclipse is an architecture for dynamic discovery, loading, and running of plug-ins. The platform handles the logistics of finding and running the right code. The platform UI provides a standard user navigation model.  Each plug-in can then focus on doing a small number of tasks well. What kinds of tasks? Defining, testing, animating, publishing, compiling, debugging, diagramming...the only limit is your imagination.

JBoss: JBoss is an application server program for use with Java 2 Platform, Enterprise Edition (J2EE) and Enterprise JavaBeans (EJB). JBoss is similar to proprietary programs such as BEA WebLogic and IBM WebSphere. JBoss is freely available under the GNU Lesser General Public License (LGPL). A corporation known as the JBoss Group, based in Atlanta, Georgia, provides support for JBoss. The JBoss Group fixes bugs in the JBoss program free of charge, but bills for custom features and consulting services.

W-SAD: IBM WebSphere Studio Application Developer is a comprehensive integrated development environment for visually designing, constructing, testing and deploying Web services, portals and Java 2 Enterprise Edition (J2EE) applications. WebSphere Studio Application Developer accelerates J2EE development with a complete set of high productivity tools, templates and wizards.

Swing :-> Swing is a widget toolkit for Java. It is part of Sun Microsystems' Java Foundation Classes (JFC) — an API for providing a graphical user interface (GUI) for Java programs. Swing includes GUI widgets such as text boxes, buttons, split-panes, and tables.

JMS:-> The Java Message Service (JMS) defines the standard for reliable Enterprise Messaging. Enterprise messaging, often also referred to as Messaging Oriented Middleware (MOM), is universally recognized as an essential tool for building enterprise applications. By combining Java technology with enterprise messaging, the JMS API provides a powerful tool for solving enterprise computing problems. 


Servlets: Servlets are the Java platform technology of choice for extending and enhancing Web servers. Servlets provide a component-based, platform-independent method for building Web-based applications, without the performance limitations of CGI programs. And unlike proprietary server extension mechanisms (such as the Netscape Server API or Apache modules), servlets are server- and platform-independent. This leaves you free to select a "best of breed" strategy for your servers, platforms, and tools.

Ajax: Ajax (Asynchronous JavaScript and XML) is a method of building interactive applications for the Web that process user requests immediately. Ajax combines several programming tools including JavaScript, dynamic HTML (DHTML), Extensible Markup Language (XML), cascading style sheets (CSS), the Document Object Model (DOM), and the Microsoft-originated XMLHttpRequest object. Ajax allows content on Web pages to update immediately when a user performs an action, unlike a full HTTP request, during which users must wait for a whole new page to load. For example, a weather forecasting site could display local conditions on one side of the page without delay after a user types in a zip code.

ColdFusion: ColdFusion, developed by Allaire, is a complete Web application server for developing and delivering scalable e-business applications. The ColdFusion solution consists of two related packages:
- ColdFusion Studio: Tightly integrated with ColdFusion Server, ColdFusion Studio provides visual programming, database, and debugging tools for building sophisticated Web applications.
- ColdFusion Server: ColdFusion Server offers all the runtime services for delivering your e-business applications built on a highly scalable and open architecture.

EJB (Enterprise JavaBeans): There are two types of EJBs: session beans and entity beans.

Session Beans: Each session bean is usually associated with one EJB client, and is created and destroyed by the particular EJB client it is associated with. A session bean can either have state or be stateless. However, session beans do not survive a system shutdown.

Entity Beans: Entity beans always have state. Each entity bean may, however, be shared by multiple EJB clients. Their states can be persisted and stored across multiple invocations, so they can survive system shutdowns.

Stateless Session Beans: These EJBs have no internal state. Since they have no state, they need not be passivated, and they can be pooled to service multiple clients (remember MTS components?).

Stateful Session Beans: These EJBs possess internal state, so they need to handle activation and passivation. However, there can be only one stateful session bean per EJB client. Since they can be persisted, they are also called persistent session beans. These EJBs can be saved and restored across client sessions: to save, a call to the bean's getHandle() method returns a handle object; to restore, call the handle object's getEJBObject() method.


Server:

WebLogic: BEA Systems' WebLogic is a server software application that runs on a middle tier, between back-end databases and related applications and browser-based thin clients. WebLogic is a leading e-commerce online transaction processing (OLTP) platform, developed to connect users in a distributed computing environment and to facilitate the integration of mainframe applications with distributed corporate data and applications. The main features of WebLogic Server include connectors that make it possible for any legacy application on any client to interoperate with server applications, Enterprise JavaBean (EJB) components, and resource pooling and connection sharing that make applications very scalable. An administration console with a user interface makes management tasks more efficient, and features such as Secure Sockets Layer (SSL) support for the encryption of data transmissions, as well as authentication and authorization mechanisms, make applications and transactions secure.

 

Websphere: WebSphere is a set of Java-based tools from IBM that allows customers to create and manage sophisticated business Web sites. The central WebSphere tool is the WebSphere Application Server (WAS), an application server that a customer can use to connect Web site users with Java applications or servlets. Servlets are Java programs that run on the server rather than on the user's computer, as Java applets do. Servlets can be developed to replace traditional Common Gateway Interface (CGI) scripts, usually written in C or Perl (Practical Extraction and Reporting Language), and run much faster because all user requests run in the same process space.

Tomcat: Tomcat is an application server from the Apache Software Foundation that executes Java servlets and renders Web pages that include JavaServer Pages coding. Described as a "reference implementation" of the Java Servlet and JavaServer Pages specifications, Tomcat is the result of an open collaboration of developers and is available from the Apache Web site in both binary and source versions. Tomcat can be used either as a standalone product with its own internal Web server or together with other Web servers, including Apache, Netscape Enterprise Server, Microsoft Internet Information Server (IIS), and Microsoft Personal Web Server. Tomcat requires a Java Runtime Environment conforming to JRE 1.1 or later.

Tomcat is one of several open source collaborations that are collectively known as Jakarta.
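The request/response model that servlet containers such as Tomcat implement can be sketched with the JDK's built-in `com.sun.net.httpserver` package. This is not a servlet container and the `/hello` path and `MiniHttp` name are invented for the example; it only illustrates a handler producing an HTTP response on the server side.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class MiniHttp {
    // Start a server on an ephemeral port, fetch /hello from it,
    // stop it, and return the response body.
    public static String demo() throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/hello", exchange -> {
            // Like a servlet's service method: build and send the response.
            byte[] body = "Hello from the server side".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            exchange.getResponseBody().write(body);
            exchange.close();
        });
        server.setExecutor(null); // use the default executor
        server.start();
        int port = server.getAddress().getPort();
        try (InputStream in = new URL("http://localhost:" + port + "/hello").openStream()) {
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        } finally {
            server.stop(0);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(demo());
    }
}
```

A real servlet would implement the same idea through the `javax.servlet` API, with the container managing threading and lifecycle for it.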

Use of Java:
- Desktop use
- Mobile devices
- Web server and enterprise use

Advanced Java :-> Java Advanced Imaging (JAI) is a Java platform extension API that provides a set of object-oriented interfaces supporting a simple, high-level programming model, which allows developers to create their own image manipulation routines without the additional cost or licensing restrictions associated with commercial image processing software.

Java Technology:

Group-1 (Web-Server & support technologies)
- JDBC (Java Database Connectivity)
- Servlets
- JSP (Java Server Pages)


Page 48: Understanding Technologies

- Java Mail

Group-2 (Distributed-Objects technologies)
- RMI (Remote Method Invocation)
- Corba-IDL (Corba using Java with OMG-IDL)
- RMI-IIOP (Corba in Java without OMG-IDL)
- EJB (Enterprise Java Beans)

Group-3 (Supporting & advanced enterprise technologies)
- JNDI (Java Naming & Directory Interfaces)
- JMS (Java Messaging Service)
- Java-XML (such as JAXP, JAXM, JAXR, JAX-RPC, JAXB, and XML Web Services)
- Connectors (for ERP and legacy systems)

Java Networking :--> The overview of networking has two sections. The first describes the networking capabilities of the Java platform that you may already be using without realizing that you are using the network. The second provides a brief overview of networking to familiarize you with terms and concepts that you should understand before reading about how to use URLs, sockets, and datagrams. Topics include:
- Working with URLs
- All about Sockets
- All about Datagrams
- Programmatic access to network parameters
- Working with cookies
- Security considerations
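The "Working with URLs" topic above starts from the fact that a URL decomposes into protocol, host, port, path, query and fragment. A small sketch using the standard `java.net.URL` class (the example URL is made up; no connection is opened):

```java
import java.net.URL;

public class UrlParts {
    // Split a URL string into protocol, host, port, path, query and fragment.
    public static String[] parts(String spec) throws Exception {
        URL u = new URL(spec);
        return new String[] {
            u.getProtocol(), u.getHost(), String.valueOf(u.getPort()),
            u.getPath(), u.getQuery(), u.getRef()
        };
    }

    public static void main(String[] args) throws Exception {
        // Parsing only inspects the string; openConnection() would be
        // the step that actually touches the network.
        for (String part : parts("http://example.com:8080/docs/index.html?name=net#intro")) {
            System.out.println(part);
        }
    }
}
```

Sockets and datagrams sit one level below this: a URL names a resource, while `Socket` and `DatagramSocket` move the bytes.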

Role & Responsibility :->
- Implementing MVC and DAO design patterns based on the Struts framework with Java Servlets and JSPs, and accessing a MySQL database with JDBC.
- Validations using the Struts framework.
- Writing various XML files for configuring the web container (web.xml) and the Application Server container (Deployment Descriptor).
- Creating pages using JSPs.
- Developing the business logic using the J2EE framework and deploying components on Tomcat Server, with MyEclipse used for component building.
- Servlets and JSPs that run on the OC4J Server and access a SQL Server database with JDBC.
- Creating JavaBean components.
- Writing various XML files for configuring the web container (web.xml) and the Application Server container (Deployment Descriptor).
- Transferring data using XML from the UI to the DB and vice versa.
- Creating pages using JSPs and servlets.
- Developing the business logic using the J2EE framework and deploying components on the Oracle OC4J Application Server, with JDeveloper 10.1.3 used for component building.
- Designing and developing user interfaces using Swing. (Note: used GridBagLayout throughout.)
- Using design patterns like MVC, and JDBC to interact with the back-end database.
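The DAO pattern mentioned above puts persistence behind an interface so the rest of the application never touches JDBC directly. A minimal sketch, where "Hotel" is a hypothetical domain object and the in-memory map stands in for the JDBC/MySQL implementation:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// The DAO interface is all the business layer sees.
interface HotelDao {
    void save(int id, String name);
    Optional<String> findName(int id);
}

// In-memory implementation; a production version would use JDBC
// behind the same interface.
class InMemoryHotelDao implements HotelDao {
    private final Map<Integer, String> rows = new HashMap<>();
    public void save(int id, String name) { rows.put(id, name); }
    public Optional<String> findName(int id) { return Optional.ofNullable(rows.get(id)); }
}

public class DaoDemo {
    public static void main(String[] args) {
        // Swapping in a JDBC-backed DAO requires no change to callers.
        HotelDao dao = new InMemoryHotelDao();
        dao.save(1, "Grand Plaza");
        System.out.println(dao.findName(1).orElse("not found"));
        System.out.println(dao.findName(2).orElse("not found"));
    }
}
```

This separation is what makes it possible to unit-test business logic without a running database, and it is why DAO is usually paired with MVC as described in the responsibilities above.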


Page 49: Understanding Technologies

- Handling View, Controller and Model independently.
- Helping the team members handling the Hotel Information, Admin & Corporate Travel modules.
- Involved in development of server-side components using JSF.
- Involved in development of Ajax components.
- Involved in development and deployment of JSF and Spring components, with analysis, troubleshooting and debugging.
- Involved in coding with the Struts framework; participated in coding Action classes and JSPs.
- Involved in module integration.
- Involved in coding form validations.

Roles & Responsibilities of a SOFTWARE DEVELOPER:
- Analysis of the specifications provided by the clients.
- Module design and development, documentation, coding.
- Successful implementation of a Configuration Management tool.
- Successful implementation of InstallShield for the product Autoload E.
- Successful implementation of a printer library for server-side processes.
- Data collection through the use of the Remoting technology of the .NET framework.
- Application deployment, commissioning and training of end users.
- Technical support and troubleshooting of problems at the site.

Web Technology (.Net)

.Net - Description

.NET is the Microsoft Web services strategy to connect information, people, systems, and devices through software. Integrated across the Microsoft platform, .NET technology provides the ability to quickly build, deploy, manage, and use connected, security-enhanced solutions with Web services. .NET-connected solutions enable businesses to integrate their systems more rapidly and in a more agile manner and help them realize the promise of information anytime, anywhere, on any device.

The infrastructure of the .NET platform from Microsoft includes the Common Language Runtime (CLR) and .NET Framework class library. The CLR provides the environment for running .NET applications, and the class library provides the foundation services, including ASP.NET, ADO.NET, Windows Forms (for building GUIs) as well as classes for accessing COM services.

.NET Framework

The Microsoft .NET Framework is a software component that can be added to or is included with the Microsoft Windows operating system. It provides a large body of pre-coded solutions to common program requirements, and manages the execution of programs written specifically for the framework.

Page 50: Understanding Technologies

The .NET Framework is a key Microsoft offering, and is intended to be used by most new applications created for the Windows platform.

Common Language Runtime

The most important component of the .NET Framework lies in the Common Language Infrastructure, or CLI. The purpose of the CLI is to provide a language-agnostic platform for application development and execution, including, but not limited to, components for exception handling, garbage collection, security, and interoperability. Microsoft's implementation of the CLI is called the Common Language Runtime, or CLR.

.NET Framework architecture

Versions

Microsoft started development on the .NET Framework in the late 1990s, originally under the name Next Generation Windows Services (NGWS). By late 2000 the first beta versions of .NET 1.0 were being released.


Page 51: Understanding Technologies

Version Name                            Version Number   Release Date
1.0 Beta 1                              1.0.????.0       2000-11
1.0 Beta 2                              1.0.2914.0       2001-06-20
1.0 RTM                                 1.0.3705.0       2002-01-05
1.0 SP1                                 1.0.3705.209     2002-03-19
1.0 SP2                                 1.0.3705.288     2002-08-07
1.0 SP3                                 1.0.3705.6018    2004-08-31
1.1 RTM                                 1.1.4322.573     2003-04-01
1.1 SP1                                 1.1.4322.2032    2004-08-30
1.1 SP1 (Windows Server 2003 Version)   1.1.4322.2300    2005-03-30
2.0 RTM                                 2.0.50727.42     2005-11-07
3.0 RTM                                 3.0.4506.30      2006-11-06
3.5 Beta 1                              3.5              2007-04-27

.NET Framework 1.0

This is the first release of the .NET Framework, released on February 13, 2002. It is available on its own as a redistributable package or in a software development kit.

.NET Framework 1.1

This is the first major .NET Framework upgrade. It is available on its own as a redistributable package or in a software development kit, and was published April 3, 2003.

.NET Framework 2.0

Released with Visual Studio .NET 2005, Microsoft SQL Server 2005 and BizTalk 2006.

.NET Framework 3.0

.NET Framework 3.0, formerly called WinFX, includes a new set of managed code APIs that are an integral part of the Windows Vista and Windows Server "Longhorn" operating systems.

.NET Framework 3.5

This version will include a new compiler supporting new features such as Language Integrated Query (LINQ), as well as new language features in C# and VB.NET. This version of the framework runs on CLR 2.0, the same runtime as .NET Framework 3.0, and is slated to be included with the version of Visual Studio following the 2005 release (codenamed Orcas). Beta 1 was released on April 27, 2007.

.Net Languages


Page 52: Understanding Technologies

.NET Languages are computer programming languages that are used to produce programs that execute within the Microsoft .NET Framework. Microsoft provides several such languages, including C#, Visual Basic .NET, and C++/CLI.

Microsoft .Net Languages

C# - Microsoft's flagship .NET Framework language, which bears similarities to the C++ and Java languages.

Visual Basic .NET - A completely redesigned version of the Visual Basic language for the .NET Framework. This also includes Visual Basic 2005 (v8.0).

VBx - A dynamic version of Visual Basic .NET that runs on top of the Dynamic Language Runtime.

C++/CLI (and the deprecated Managed Extensions for C++, also known as Managed C++) - A managed version of the C++ language.

J# - A transitional .NET language for Java and J++ developers.

JScript .NET - A compiled version of the JScript language.

Windows PowerShell - An interactive command line shell/scripting language which provides full access to the .NET Framework.

IronPython - A .NET implementation of the Python programming language developed by Jim Hugunin at Microsoft.

IronRuby - A dynamically compiled version of the Ruby programming language targeting the .NET Framework.

F# - A member of the ML programming language family.


Page 53: Understanding Technologies

ASP.NET is a web application framework marketed by Microsoft. Programmers can use it to build dynamic web sites, web applications and XML web services. It is part of Microsoft's .NET platform and is the successor to Microsoft's Active Server Pages (ASP) technology.

ASP.NET is built on the Common Language Runtime, meaning programmers can write ASP.NET code using any Microsoft .NET language.

ADO.NET is a set of computer software components that can be used by programmers to access data and data services. It is a part of the base class library that is included with the Microsoft .NET Framework. It is commonly used by programmers to access and modify data stored in relational database systems, though it can also be used to access data in non-relational sources. ADO.NET is sometimes considered an evolution of ActiveX Data Objects (ADO) technology, but was changed so extensively that it can be perceived as an entirely new product.
