Melbourne VMUG, 9 December 2010
VMware at La Trobe
VMWare at La Trobe Page 2
Presenter
Peter Harms (RHCE)
Unix Systems Team Leader
La Trobe University, Bundoora
VMware Evangelist
[email protected]
03 9479 5163
Presenter History
• First used VMware in 2000 (Workstation v1)
• Heavy user of Workstation v3 and v4 from 2000–2005
• No exposure to GSX or ESX before starting at La Trobe in 2006; took over the ESX admin role in 2007
• Now look after the entire VMware deployment at the University
VMware “View” 10 years old?
In early 2001 a colleague and I developed a system using Workstation 3 and advanced snapshot management
REDO log files were manipulated in such a way that we could rapidly deploy applications and maintain user settings using VMware Workstation
The concept was presented to my then employer, promoting the benefits of using virtualisation in the classroom/lab environment
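The REDO-file trick can be illustrated with a small sketch. Everything here is an assumption for illustration: the directory layout, the file names and the helper are hypothetical, but they follow the general idea of Workstation's undoable-disk mode of that era, where per-session changes accumulated in a .REDO file next to the base disk, so discarding or swapping REDO files reset a machine or restored saved state.

```python
import os
import tempfile

def reset_session(vm_dir, keep=("user-settings.REDO",)):
    """Discard per-session REDO files so the VM boots from the clean
    base disk again, while preserving named REDO files (e.g. one that
    holds a user's settings between lab sessions).  Illustrative only."""
    for name in os.listdir(vm_dir):
        if name.endswith(".REDO") and name not in keep:
            os.remove(os.path.join(vm_dir, name))

# Demo in a scratch directory standing in for a Workstation VM folder.
vm = tempfile.mkdtemp()
for f in ("win2k.vmdk", "win2k.vmdk.REDO", "user-settings.REDO"):
    open(os.path.join(vm, f), "w").close()

reset_session(vm)
print(sorted(os.listdir(vm)))  # → ['user-settings.REDO', 'win2k.vmdk']
```

The automation described on the next slide amounted to running this kind of cleanup between classes, so every seat came up with a pristine application image but the user's own settings survived.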
The process was completely automated and went into production soon after, and was still in use in 2006
The workstation and server specifications were quite low...
Server specifications
• Dual Pentium-class CPU running at 750MHz or greater
• 512MB RAM (1GB preferred)
• 60GB hard drive (minimum) for storing user settings between sessions
Workstation specifications
• Pentium-class CPU running at 750MHz or greater
• 256MB RAM (512MB if running Windows 2000 Server)
• Quality video card (TNT2 class)
• 10GB hard drive
Original La Trobe Environment
In 2006/2007 the ESX environment at La Trobe was small:
• 3 ESX hosts
• Version 2.5 (maybe even earlier)
• Dual-core server CPUs
• 16GB RAM
• Few datastores
• Few VMs
Current Environment
Over the last 4 years the environment has grown substantially:
• ESX Enterprise Plus – version 4.0 Update 2, transitioning to ESXi 4.1
• 2 vCenter servers
• 33 hosts in 3 datacentres
• 438 virtual machines
• 8 clusters
• 78 SAN datastores
• 38 networks
Production Environment
• 14 hosts
  – 10 HP BL460c G6: dual socket, quad core, 72GB RAM
  – 4 IBM 3850 M2: quad socket, hex core, 72GB RAM
• 200 virtual machines
• 1 datacentre
• 3 clusters
  – 1 Linux, 2 Windows
  – HA and DRS
  – Site Recovery Manager
Dev Environment
• 15 hosts
  – 8 HP BL460c G1: dual socket, dual core, 32GB RAM
  – 2 HP BL480c G1: dual socket, dual core, 16GB RAM
  – 5 IBM x3655: dual socket, dual core, 32GB RAM
• 160 virtual machines
• 2 datacentres
• 4 clusters
  – 1 Linux, 2 Windows, 1 Test
  – HA and DRS
DR Environment
• 3 hosts
  – 3 HP BL460c G1: dual socket, dual core, 32GB RAM
• 4 production virtual machines sharing the hosts
• 1 datacentre
• 1 cluster
  – HA
  – DRS
  – SRM
  – 8 datastores being replicated
Current VM Workload
• Public website www.latrobe.edu.au and intranet
  – Legacy and CMS
  – Application servers: PHP, Perl, Tomcat, IIS
  – Database servers: Postgres, MySQL, Oracle
• Student Management System
  – Physical v virtual performance testing
• Learning Management System
  – WebCT
  – Moodle
Current VM Workload
• DNS master and secondary nodes
• Lectopia management node
• Email gateways and quarantine DB
• Listserv
• Squid proxies
• Windows file and print
• WINS
• Monitoring
  – Cacti
Physical v Virtual Testing
Physical machine specification: 12 cores, 32GB RAM
• Users supported: 85 VUs (CPU peaks ≥ 80% beyond 85 users)
6 VMs on the same hardware:
• Users supported: 150 VUs
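The gain those two figures represent works out as follows; this quick check uses only the numbers on the slide:

```python
physical_vus = 85   # users supported before CPU peaks pass 80%
virtual_vus = 150   # users supported with the box carved into 6 VMs

gain = (virtual_vus - physical_vus) / physical_vus
print(f"{gain:.0%} more users on the same hardware")  # → 76% more users on the same hardware
```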
Conclusion:
• VMware handled a greater number of users with better response times compared to the physical hardware
Environment Considerations
• CPU compatibility
  – Essential for vMotion
  – Difficult to maintain over a 3- or 4-year hardware cycle
• RAM utilisation
  – Never seem to have enough RAM
  – Physical OS separation to maximise shared memory
• Storage
  – LUN sizing
  – VM grouping
  – Never seem to have enough storage
• Network
Design Considerations
• Cluster design
  – To pool or not to pool
  – Physical separation
  – OS grouping
  – Host consistency
• Cluster sizing
  – HA considerations
  – Time to evacuate
  – Impact of a failed host
  – VM density
• Getting the right CPU/RAM balance – 8GB per core
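The sizing trade-offs above can be made concrete with a small calculator. The N+1 failover reserve and the example host shape are assumptions for illustration; the 8GB-per-core ratio is the one quoted on the slide.

```python
def cluster_capacity(hosts, cores_per_host, ram_gb_per_host,
                     failures_to_tolerate=1, gb_per_core=8):
    """Usable capacity of an HA cluster: hold back enough headroom to
    evacuate `failures_to_tolerate` hosts, and report whether the
    hardware hits the 8 GB-per-core target."""
    usable_hosts = hosts - failures_to_tolerate
    usable_cores = usable_hosts * cores_per_host
    usable_ram_gb = usable_hosts * ram_gb_per_host
    balanced = ram_gb_per_host == cores_per_host * gb_per_core
    return usable_cores, usable_ram_gb, balanced

# Hypothetical example: a 5-host cluster of BL460c G6-style machines
# (dual socket, quad core = 8 cores, 72GB RAM), tolerating one failure.
print(cluster_capacity(hosts=5, cores_per_host=8, ram_gb_per_host=72))
# → (32, 288, False)  – 72GB on 8 cores is 9GB/core, slightly RAM-heavy
```

The `balanced` flag shows why the impact of a failed host and VM density belong in the same list: the fewer hosts in the cluster, the larger the fraction of capacity one failure takes away.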
Design Considerations
• Monitoring
  – Be prepared to defend the environment
    • "Oh, it's virtual – there's your problem..."
    • "This application is not supported in a virtual environment"
  – Know what the environment is doing
    • Percent-ready creep
    • Balloon blowouts
  – Know how to produce performance reports
    • Confidence must be tangible
  – Buy the right tools
    • vCenter does not do it all
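"Percent-ready creep" is worth knowing how to quantify, because vCenter exposes CPU ready as milliseconds accumulated over a sampling interval, not as a percentage, and the conversion is a common gotcha. A minimal sketch, assuming the usual 20-second real-time sampling interval; the ~5% alert threshold is a common rule of thumb, not something stated on the slide:

```python
def cpu_ready_percent(ready_ms, interval_s=20, vcpus=1):
    """Convert a CPU-ready summation (milliseconds accumulated over one
    sampling interval) into a percentage per vCPU."""
    return ready_ms / (interval_s * 1000 * vcpus) * 100

# 1,600 ms of ready time in a 20 s sample on a 1-vCPU VM:
print(cpu_ready_percent(1600))  # → 8.0  (well above the ~5% rule of thumb)
```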
Performance Monitoring
• VMware AppSpeed
• Quest vFoglight
Performance Monitoring
• VMware AppSpeed
  – Agent on selected ESX hosts, connected to a vSwitch
  – Management VM
  – Plugin to vCenter
  – Monitors the performance of applications: HTTP, MSSQL
  – In-depth analysis of database and web server performance
  – Reply size, latency, hits and throughput
  – Automatically discovers and categorises traffic
VMware AppSpeed (screenshots)
Performance Monitoring
• Quest vFoglight
  – Many automated reports
  – Performance history
  – Alarm explanation
  – Fault prediction
  – Capacity planning
  – Resource hungry!
Quest vFoglight (screenshots)
Performance Monitoring Examples (screenshots)
Environment Consistency
• Concept of server "classes" developed in 2009
  – Standardise the server hardware
  – More flexible usage options
  – Better fault recovery
    • Drop-in replacement an option
    • Mobile datacentre possibilities
• 5 classes for blade and rack servers
  – From single socket, quad core, 6GB RAM
  – to dual socket, quad core, 72GB RAM
  – All with a 300GB internal disk mirror
  – Blades preferred
Environment Consistency
• Concept of server classes further developed in 2010
  – More emphasis on ESX hardware
  – Anything less than a Class C will be virtual by default
  – Target is 90% virtual within 12 months
• Sell the benefits of virtualisation
  – Low-density hosts are worth it
  – DR made easy
  – Cut the hardware ties
  – Application mobility and uptime
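The "anything less than a Class C is virtual by default" rule can be sketched as a simple placement check. The class letters and their ordering are hypothetical stand-ins — the deck states the rule and that five classes exist, not the exact boundaries:

```python
# Hypothetical ordering of the five server classes, smallest first.
CLASS_ORDER = ["A", "B", "C", "D", "E"]

def default_placement(server_class, virtual_floor="C"):
    """Virtual by default for any request smaller than the floor class;
    larger requests are assessed individually."""
    if CLASS_ORDER.index(server_class) < CLASS_ORDER.index(virtual_floor):
        return "virtual"
    return "assessed case by case"

print(default_placement("A"))  # → virtual
print(default_placement("D"))  # → assessed case by case
```

Encoding the policy this way is what drives the 90%-virtual target: small requests never reach a hardware purchase decision at all.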
High Density Virtualisation
• Virtualisation challenges
  – Large enterprise applications
    • SAP
    • SQL/Oracle farms
  – Domain controllers
  – Desktop virtualisation
  – Patching un-clustered high-availability servers
  – True high availability
  – Fault Tolerance
Low Density Hardware
Medium Density Hardware
High Density Hardware
Next Environment
• Cisco UCS being deployed
  – 15 B200 M2: dual socket, hex core, 96GB RAM
    • General-purpose ESX – high density, low spec
  – 10 B200 M2: dual socket, hex core, 48GB RAM
    • Specific-purpose ESX – low density, high spec
  – 3 C200 M2: dual socket, hex core, 48GB RAM
The Cloud
• Was a cloud sceptic, now a convert!
• Private cloud with vCloud Director
  – Centralised IT management
  – Remove the "I need access to the console" problem
  – Delivering true "Infrastructure as a Service"
  – "Virtualise" the ESX hosts
Questions
Thank You