
3 Key Components of an Intelligent IT Automation Solution

Contents

Part One: The Scalable Automation Engine

Part Two: The Rich Content Library

Part Three: The Intuitive Workflow Designer

Part One: The Scalable Automation Engine

Scalability has never been so important to enterprise IT. As businesses deal with more processes, fewer financial and human resources, and greater volumes of data than ever before, the ability to quickly and cost-effectively adjust to changing demands has become an essential element of IT operations.

Many factors are behind this need. Big Data, cloud computing, and the Internet of Things are making complex processes both possible and necessary. IT organizations often run thousands of processes—sometimes millions—every day. Even for those firms running a limited number of jobs, the complexity of those tasks often involves a large and/or variable number of IT resources.

Another dynamic at work is that processes have evolved on multiple levels. Gartner recommends that most enterprises adapt to what it calls “Bimodal IT,” referring to a split in the way IT organizations must be structured to achieve both value and agility. Mode 1, or traditional IT, is focused on reducing cost while ensuring dependability, safety, and approval-based governance of applications. Mode 2, by contrast, is nimble and non-traditional, centered on rapid, continuous development. In Mode 2, IT doesn’t have to be perfect—but it must be quick. In the age of Bimodal IT, enterprises need to find ways to achieve the cost and reliability goals of Mode 1 while gaining the flexibility and speed of Mode 2 to bring applications and processes to market faster.

As IT is tasked with greater numbers of both business and operational processes, the demands on people, technologies, and resources are increasing. Job scheduling tools—which originated in batch processing—are now called upon to handle multiple, highly sophisticated functions such as business process automation, IT process automation, Big Data automation, and application release automation. These myriad challenges require an IT automation solution that can scale along multiple dimensions:

Wider – to accommodate more kinds of processes, more applications, more languages, and more platforms.

Bigger – to handle deployments and the jobs they encompass, which can grow in number very quickly.

Faster – because many processes have real-time, or near real-time, requirements, making dynamic resource provisioning a necessity in many cases.

In a different report, Gartner states that IT organizations use, on average, three to eight different scheduling and automation tools to meet the heavy automation loads in their daily operations. Rather than solving the problem, this approach only adds to the workload and complexity of IT—driving up costs and spreading technical resources thinner than ever before. Clearly, scalability is becoming a mandatory trait for automation tools. A unified solution with upside capacity, combined with the ability to reduce reliance on coding and scripting, can produce significant business benefits.


Advantages of a Scalable IT Automation Platform

Virtually all automation solutions can scale—but to varying degrees and with widely ranging impact. Intelligent IT automation platforms built for heterogeneity and high capacity, however, can deliver numerous advantages.

First, a scalable IT automation solution gives IT the ability to automate a surprising array of business processes such as data warehousing, ETL, business intelligence, and more. These processes can be initiated according to date/time triggers, pre-defined events, and/or various dependencies. While business processes are important, in today’s world IT looks to automation to control other functions as well. IT process automation (ITPA)—e.g., file transfers, employee onboarding/offboarding, and IT service management—is one key category. Another is Big Data automation, involving the staging and organization of data using Hadoop and its many ecosystem components (HDFS, Hive, MapReduce, Spark, Oozie, Sqoop, etc.). Still another is Application Release Automation (ARA), spanning the many steps software developers need to design, build, test, debug, and release applications.

A second advantage of scalability is risk management. This benefit has become increasingly important in recent years as the pace of technology change has accelerated. The possibilities of the digital age bring both opportunities and threats that weren’t possible—or even imagined—in the past. As IT is called upon to do more and more, a scalable workload automation platform supports these additional tasks without exacerbating concerns about speed, accuracy, or system performance.

A third advantage is cost reduction. Many workload automation solutions are priced according to “high water marks” that can be very expensive if exceeded. Scalable solutions, delivered via a subscription or perpetual pricing model, provide flexibility that not only accommodates unexpected changes in demand, but also minimizes business and operational risk.

Use Cases for Scalability

What users should be looking for, then, is a powerful, affordable automation engine that creates value through its ability to scale along multiple dimensions, enabling organizations to adapt to change very quickly. This capability can apply to virtually any industry vertical.

Ignite Technologies

An excellent example of how a scalable automation platform can unify and coordinate multiple processes is in the SaaS service provider category. Ignite Technologies, a Texas-based provider of business service applications spanning finance, HR, content delivery, IT sales, marketing, and security, is a significant and successful player in the SaaS field, serving audiences numbering in the tens of thousands and running over 250,000 jobs with a 99.9% success rate. As Ignite has grown, it has added jobs to a datacenter comprising a diverse array of servers and database formats, including SQL Server and Microsoft Hyper-V. The company once used Windows Task Scheduler to handle custom scripts—but since then it has adopted an enterprise-wide scheduling platform. “It’s about scalability,” said Alan Davis, Network Engineer for Ignite. “Could I go and write scripts for everything we want to do? Yes, but it would take forever and centrally managing the workflows would be nearly impossible.”



Statkraft

Big Data is another trigger for scalability. Statkraft, Europe’s largest renewable energy producer, pays close attention to historical and predictive weather data, merging this information with analytics on consumption and pricing to drive its market forecasts. The volumes of data involved are significant. For years, Statkraft used manual processes to collect and compile its weather data. With a scalable workload automation engine, however, it can connect with a wide range of operating systems and applications. The company’s initial success with such an engine, in fact, gave it the confidence it needed to expand its workload automation platform to a second office, replacing an in-house scheduler. The first office executes 2,000 jobs per day; the second runs an incredible 100 jobs per minute. In a commissioned, independent study, it was determined that Statkraft’s three-year, risk-adjusted ROI for its workload automation platform was 153%. The payback period was a mere 3.4 months. What’s more, Statkraft can now make quicker, more informed decisions when there is a shift in volatile energy markets.

SThree

Mobile device management can stress any fast-growing company—a fact SThree knows firsthand. Now a giant in the recruiting industry, SThree was founded with just four employees and one office. Today the firm has 2,300 employees and 44 offices in 18 countries. “We literally had to run [our manual script-driven processes] around the clock and IT services needed to meet a lot of new business conditions,” said senior partner Garry Lengthorn. A large portion of the jobs run by SThree’s IT organization—a schedule of 520 jobs and 250,000 instances per day—has to do with onboarding and offboarding new employees, many of whom are temporary workers. The company also implemented a secure mobility solution as part of its BYOD program, which further added to its processing workload. With a highly scalable workload automation platform on hand, however, SThree ably handles its processing requirements. Lengthorn reports that his company’s platform has scaled so well, it has not had to change the initial configuration in over ten years. “Our user base has grown significantly, but the same backend processes are still in place,” he notes.


Part Two: The Rich Content Library


The function of an IT automation platform is to simplify and enhance coding—not replace it. After all, most organizations are not only well-versed in scripting, but also have a heavy investment in job scheduling scripts written in PowerShell, Python, Java, shell, and other languages. These scripts are designed for specific products, applications, and services, and are in frequent use. What’s more, developers are usually comfortable with their coding languages. So why, then, is it important to adopt an IT automation solution with rich content? Because an extensive, well-designed, pre-tested content library has the power to foster innovation within the IT organization and, by extension, the broader enterprise. This part of the paper details the ways such a library can enhance productivity, create new opportunities, save time, and reduce costs.

Benefits of a Rich Content Library

Certainly, a library built on rich content reduces reliance on custom coding. Prebuilt scripts save time, and they give developers the option to incorporate content upstream and downstream from an existing script with little to no additional work.

What is an IT Automation Content Library?

A collection of production-ready, self-documenting integrations for commonly-scripted actions that can be easily and reliably assembled to build workflows.

What Makes for a Rich Content Library?

Breadth – A rich content library can address a wide variety of process types in the areas of workload automation, IT process automation (ITPA), Big Data automation, application release automation (ARA), and more, through integration with a broad range of applications, databases, and platforms.

Depth – In addition to addressing key areas of the evolving IT landscape, a rich content library provides a deep level of integration within each of these key areas, allowing developers to automate more without having to write custom code.
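To make the idea of incorporating prebuilt content upstream and downstream from an existing script more concrete, the short Python sketch below models a workflow in which a vendor-supplied file-transfer step runs before a legacy script and a notification step runs after it. The class, step names, and functions are hypothetical stand-ins for whatever integrations a given content library actually provides, not any vendor's real API.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class JobStep:
    """A prebuilt, self-documenting unit of work (hypothetical model)."""
    name: str
    description: str              # plain-language explanation of the step's purpose
    action: Callable[[], None]

def download_source_file() -> None:
    # Stand-in for a vendor-supplied managed file transfer integration.
    print("Downloading nightly extract from the SFTP drop zone...")

def run_legacy_script() -> None:
    # Stand-in for the organization's existing script, reused as-is
    # (in practice this would invoke the script, e.g. via the shell).
    print("Running the existing transform_extract script...")

def notify_operations() -> None:
    # Stand-in for a prebuilt notification integration.
    print("Notifying operations that the nightly load completed.")

workflow: List[JobStep] = [
    JobStep("Get extract", "Pull the nightly file before the legacy transform runs", download_source_file),
    JobStep("Transform", "Existing in-house script, incorporated unchanged", run_legacy_script),
    JobStep("Notify", "Tell operations that the load finished", notify_operations),
]

for step in workflow:
    print(f"[{step.name}] {step.description}")
    step.action()
```

Because each step carries its own description, printing the workflow also documents it—the same "why" that a self-documenting content library surfaces for assembled workflows.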


More than saving time and effort, however, a rich content library supports the development of new ideas for better business/operational performance. If an IT organization can quickly integrate with a wide range of third-party apps, databases, and operating systems, it encourages innovation. The tested logic in Job Steps—the off-the-shelf “building blocks” found in a content library, ready for assembly into workflows—speeds development of these innovative ideas, as the Job Steps are designed, tested, and maintained by the vendor. Because the vendor ensures the reliability and consistency of the integrations, the developer can focus on the function or purpose of the integrations rather than the logic behind them.

A well-designed library should have a self-documenting capability that describes the function of Job Steps and assembled workflows in plain language. While comments can be added for further explanation, a self-documenting capability can be invaluable for explaining the “why” of the script—the most important aspect of any documentation.

Finally, a good content library allows single Job Steps or groups of Job Steps with pre-populated information to be saved as templates that significantly shorten the coding process. This feature is essential, and not just for reuse by the code author. Templates enable less experienced and/or junior developers to take over future programming tasks; in fact, templates can allow them to manage and direct complex processes with a reasonable level of pre-established expertise.

The simplicity and power of a rich content library can encourage smarter solutions in the multiple areas that comprise the modern IT operation:

Workload Automation - Business processes including data warehousing, ETL, and business intelligence.

IT Process Automation - Managed file transfers, file operations (move, copy, rename, delete, etc.), data transfers, backups, onboarding/offboarding, database operations, and service management.

Big Data Automation - Organizing and staging data using the Hadoop ecosystem (HDFS, Hive, MapReduce, Spark, Pig, Sqoop, Oozie, and others).

Application Release Automation - Automated code builds, testing, deployments, and continuous delivery.

Innovation in these operations occurs when people are given opportunities to think and create. In most IT organizations, even the most highly skilled developers spend much of their time on tasks such as maintaining, testing, troubleshooting, and fixing code. These tasks take time—but a rich automation library can make workflow development easier and faster, with greater flexibility and more options. Leveraged correctly, such a library can be of immense strategic value.

>>Rich Content Use Cases

The content library in an intelligent automation platform can liberate an organization to build more robust, distinctive, and efficient solutions that solve problems, provide better information faster—and even create competitive advantage. With a push for more agile development, a lot of time can be saved because developers spend less time researching, designing, testing, and debugging; as a result, they can deploy not only faster, but also with greater reliability.

The following cases exemplify the value of rich automation content.

Lamar Advertising

Baton Rouge, Louisiana-based Lamar Advertising operates more than 315,000 outdoor displays across the U.S., Canada, and Puerto Rico. The company’s homegrown job scheduler, however, made it difficult for the IT department to meet its real-time data processing demands. According to Jude Robert, MIS Operations Manager, IT staffers were writing scripts for all kinds of business tasks, from contract management to general ledger entries. “Our homegrown solutions had no inherent functionality,” he recalled. “For example, copying a file wasn’t part of the system. We had to implement even the simplest tasks in code. We also were very limited in notifications.” Once Lamar installed an automation platform with rich content, however, developers were able to quickly build workflows for accounts receivable, data warehousing, ETL processes, and more. Interfacing with the data warehouse, jobs could run SQL queries that automatically generated reports for both IT and business users. The Jobs Library also provided built-in execution queues for jobs waiting to execute on a specific machine or group of machines. Because a queue points to the system where the agent is installed, a machine change is made in one place only—the queue’s “machine” property. If one of Lamar’s machines changes, the user doesn’t have to modify multiple jobs, just the queue, saving time and minimizing manual updates. Furthermore, if a specific machine isn’t available at the prescribed time, the automation platform automatically executes the process on another available machine.

Children’s Hospital and Medical Center of Omaha

As the only full-service pediatric specialty healthcare center in Nebraska, Children’s Hospital receives over 350,000 patient visits a year. Its Business Intelligence (BI) system is used by everyone from Finance and Access to Infection Control and even C-level executives. The hospital used to perform ETL tasks via a combination of manual operations and internally generated scripts. While Cron, SQL Server, Caché, and Microsoft Windows were all used, none could handle all the tasks from the facility’s two BI applications, QlikView and Crystal Reports. Using the production-ready Job Steps in an advanced automation platform, however, changed the paradigm. The Job Steps for SQL Server, for example, replaced scripts that delivered reports from the electronic medical record application, EpicCare EMR. The IT team also built workflows that automated over 40 different file transfer processes with 40-plus third-party vendors. “We have yet to find a vendor that uses some sort of a secure file transfer method that isn’t supported,” said Jeff Spilinek, ETL architect for the hospital. Children’s Hospital also uses the library to build flexible job queues. “We may have 12 jobs scheduled and, depending on the situation, we might run four at a time, saving another four for later and perhaps the remaining four at 3AM,” noted Wendy Worthing, Director of IT Operations. “That kind of throttling capability conserves resources—in the past, such a capability would have required a lot of manual scripting.”
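The execution-queue idea in the Lamar example can be pictured with a small, hypothetical Python sketch: jobs reference a named queue rather than a machine, so retargeting every job is a single change to the queue's machine property. The class names and properties below are illustrative only, not the platform's actual object model.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ExecutionQueue:
    """Points at the system (or group) where the agent runs jobs."""
    name: str
    machine: str   # the single place a machine change is made

@dataclass
class Job:
    name: str
    queue: ExecutionQueue   # jobs reference the queue, never a machine directly

    def run(self) -> None:
        print(f"Running {self.name} on {self.queue.machine} (queue: {self.queue.name})")

# One queue shared by many jobs.
warehouse_queue = ExecutionQueue(name="DW-Queue", machine="dw-server-01")
jobs: List[Job] = [Job("AR extract", warehouse_queue),
                   Job("GL load", warehouse_queue),
                   Job("Nightly reports", warehouse_queue)]

for job in jobs:
    job.run()

# Retiring dw-server-01? Update the queue once; every job follows automatically.
warehouse_queue.machine = "dw-server-02"
for job in jobs:
    job.run()
```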



Service Library Creates Additional Value

Even the largest content library, however, cannot accommodate every automation need out of the box, which is why a supplementary Service Library is essential. A Service Library extends the power of the content library by letting developers load APIs (Web Services, REST services, .NET assemblies, stored procedures, CLIs, etc.) and expose their methods as structured Job Steps, available for use in the workflow designer without any tweaks or changes. This allows developers to integrate with, and expand, their content library with Job Steps for virtually any application without writing lines of custom code. By simply dragging and dropping these Job Steps within the automation interface, any kind of workflow can be created. Job Steps can be used once or saved for reuse.

>>Service Library Use Case

Türkiye Finans is one of the Middle East’s leading commercial banks, with over one million customers and 200 branches. To schedule the bank’s many IT processes, developers relied on PowerShell, .NET assemblies, Microsoft WCF LOB adapters, and Web Service protocols executed via Windows Task Scheduler and SQL Server Agent. Task Scheduler, however, cannot build event-based workflows, creating bottlenecks for IT staff. “You could easily make a mistake,” recalled Mucahit Yavuz, release and patch manager in IT operations. “These scripts were automating critical jobs for the transfer of financial and transactional data, payments, reports, and so on. Each time we defined a job in Task Scheduler or had to make an update, a single mistake could block the execution of all other scheduled operations.” Task Scheduler could schedule the various PowerShell scripts along with Microsoft WCF and Web Services protocols, although a third-party program and an XML configuration file were necessary to run the jobs. By using the Service Library in its automation platform—a dynamic SOA and Web Services tool supporting multiple integration points, including command line and stored procedures—Türkiye Finans can call upon specific .NET assemblies or Web Services, retrieve their methods, and expose the methods as reusable Job Steps. The values returned from those methods are passed downstream as execution variables to trigger ensuing jobs. According to Yavuz, this ability streamlined what had been a difficult and error-prone process. “We’re now able to process data returning from the methods our automation platform retrieves,” he stated. “It’s a huge time saver and gives us a level of control and management over these processes that never existed before.”
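To illustrate the pattern described above—loading an API, exposing a method as a job step, and passing its return values downstream—the hypothetical Python sketch below wraps a callable as a reusable step whose result becomes an execution variable for the next job. The method, field names, and helper functions are invented for the example; in the real Service Library the callable would be generated from a Web Service, REST endpoint, .NET assembly, or stored procedure.

```python
from typing import Any, Callable, Dict

# Stand-in for a method exposed by the Service Library; simulated here so the sketch runs.
def get_pending_payment_batch() -> Dict[str, Any]:
    return {"batch_id": "PAY-20200220-01", "record_count": 4182}

def make_service_step(name: str, method: Callable[[], Dict[str, Any]]):
    """Wrap a loaded method as a reusable job step that publishes its result."""
    def step(variables: Dict[str, Any]) -> None:
        result = method()
        # Return values become execution variables for downstream jobs.
        variables.update(result)
        print(f"[{name}] published variables: {result}")
    return step

def transfer_payment_batch(variables: Dict[str, Any]) -> None:
    # Downstream job is triggered with the upstream method's return values.
    print(f"[Transfer] sending batch {variables['batch_id']} "
          f"({variables['record_count']} records)")

workflow = [make_service_step("Get batch", get_pending_payment_batch),
            transfer_payment_batch]

execution_variables: Dict[str, Any] = {}
for job in workflow:
    job(execution_variables)
```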


Part Three: The Intuitive Workflow Designer


In a recent study by the analyst firm EMA, “easier workflow design” was cited as the number one reason organizations would migrate to a different IT automation solution. While this finding may surprise some, it’s also entirely logical: enterprise IT is, at its core, in the business of executing workflows. It’s essential to know not only how IT workflows are designed and assembled, but also how to design and assemble them in a way that minimizes risk and speeds deployment, enabling businesses to maintain their competitive advantage.

In the world of IT, a workflow is the process by which information is obtained, defined, transformed, and transmitted according to predefined rules for a specific organizational purpose. Each step in an IT workflow can be considered a job. Jobs are linked by triggers (sequential, conditional, dependency, and so on) to create the larger process. IT automation platforms further refine and formalize jobs, triggers, and workflows to make even complex IT processes run more efficiently and dependably. In such a platform, commonly used jobs are either housed in the form of scripts or assembled, using tested logic, as job steps. Recall from the previous part that job steps are the prebuilt, out-of-the-box building blocks—integrations for commonly-scripted actions—that are designed, tested, and maintained by the automation vendor so the developer can focus on the function or purpose of the action rather than the logic behind it. Gathered into a content library, these prebuilt actions can be assembled into processes.

A content library of automated integrations is a powerful tool for developers. The best IT automation platforms allow workflows and their component jobs to be quickly and easily assembled to solve virtually any need across heterogeneous applications, operating systems, networks, and even combinations of virtual and physical resources. The workflow assembler adds whatever completion logic is needed. Compared with scripting, assembly can reduce development time by 50% or more—a benefit of immense value in an age when developer resources are tight and rates of change for IT organizations are constantly increasing. An excellent workflow assembler can bend the development cost curve downward and allow an IT organization to adapt to change much more easily. But what traits in an automation platform are necessary to facilitate the process of workflow assembly? How can IT organizations use these traits to conserve time, cost, and resources—and most of all, to increase IT agility?
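The jobs-and-triggers model can be pictured with a brief Python sketch: each job declares the jobs it depends on, and a tiny runner executes them in dependency order. This is an illustrative toy with invented job names, not the product's scheduling engine.

```python
from typing import Dict, List

# Each job lists the jobs that must complete before it may run (dependency triggers).
workflow: Dict[str, List[str]] = {
    "extract_sales": [],
    "extract_inventory": [],
    "load_warehouse": ["extract_sales", "extract_inventory"],
    "publish_reports": ["load_warehouse"],
}

def run(workflow: Dict[str, List[str]]) -> None:
    completed: List[str] = []
    pending = dict(workflow)
    while pending:
        # Pick every job whose dependencies have all completed.
        ready = [job for job, deps in pending.items() if all(d in completed for d in deps)]
        if not ready:
            raise RuntimeError("Cycle or unsatisfiable dependency in workflow")
        for job in ready:
            print(f"Running {job} (after: {pending[job] or 'nothing'})")
            completed.append(job)
            del pending[job]

run(workflow)
```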


Requirements of a Great Workflow Designer

While all IT automation platforms should have a workflow assembly tool, some offerings available today are better than others. The best platforms provide specific features that not only conserve developer resources, but also uncover new competitive and operational opportunities. The following are essential to an exceptional workflow assembler.

It must have a strong job steps editor. The ability to easily assemble job steps in a logical fashion is the sign of an advanced workflow assembler. Using a drag-and-drop interface, the editor allows users to select and insert job steps into a job; these prebuilt actions can either be provided out of the box or incorporated by loading third-party APIs to extend the capabilities of the content library with even more integrations. A strong job steps editor also supports the insertion of existing scripts or executables, as well as the combination and/or integration of scripts with prebuilt job steps. Finally, faster and more reliable assembly of job steps is further facilitated by tools that allow developers to reuse rather than rewrite; examples include templated integrations and the ability to copy and paste job steps between jobs.

It must be intuitive and dynamic. Many enterprises have built thousands of job templates and may be running hundreds of thousands of instances every day. An assembler that is intuitive (with a short or no learning curve) and dynamic (responding in real time or near real time) allows developers to easily manipulate the jobs, workflows, and objects (calendars, queues, schedules, etc.) involved in daily workflows. A strong graphical interface makes it easy to display and construct workflows and, when necessary, change the flow logic of the workflow. It also shows the dependencies and constraints that may exist on a particular job—for example, what happens when a job fails, completes, or is aborted—making it easier to execute changes quickly and reliably.

It provides quick insight into workflow design. A key to good workflow design is the ability to improve performance by gaining insight into how a workflow’s “shape” impacts runtime. For example, if jobs “A”, “B”, “C”, “D”, and “E” are running sequentially, a developer should be able to determine what happens if “B” and “C” are changed to run in parallel. A graphical view makes this possible; it also reveals how changes to the shape affect the critical path as well as total elapsed time and other parameters (a brief sketch below illustrates this calculation).

It allows workflow simulations. Well-designed assemblers can support workflow test runs without actually running the payload. The ability to simulate a workflow without executing long-running payloads allows developers to identify, test, and resolve completion triggers, constraints, completion status, workflow path, and other logic issues. They can also configure the simulation with failures, errors, and success conditions for testing purposes, without ever waiting for long-running payloads to complete.
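To show how a workflow's shape drives elapsed time, the hypothetical Python sketch below "simulates" two arrangements of the jobs A through E—fully sequential, then with B and C in parallel—using assumed per-job durations rather than real payloads. The durations are invented for illustration only.

```python
from typing import Dict, List

# Assumed durations in minutes for each job (illustrative only).
duration: Dict[str, int] = {"A": 10, "B": 30, "C": 20, "D": 15, "E": 5}

def elapsed(stages: List[List[str]]) -> int:
    """Each stage runs its jobs in parallel; stages run one after another.
    Elapsed time is the sum over stages of the longest job in each stage."""
    return sum(max(duration[j] for j in stage) for stage in stages)

sequential = [["A"], ["B"], ["C"], ["D"], ["E"]]
b_c_parallel = [["A"], ["B", "C"], ["D"], ["E"]]

print("Sequential A-E:     ", elapsed(sequential), "minutes")    # 80
print("B and C in parallel:", elapsed(b_c_parallel), "minutes")  # 60
```

Because the "run" is just arithmetic over the workflow's shape, the comparison completes instantly—the same reason simulated runs let developers test triggers and constraints without waiting on long-running payloads.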


It supports Check-Out | Check-In. A powerful advantage for designers, Check-Out | Check-In functionality supports team collaboration as well as exclusive access for individual developers. When collaboration is desirable, multiple team members can work on the same workflow simultaneously; members can check out an object and make changes, and before those changes are checked in, the feature allows developers with different ideas to resolve conflicts. At other times, one developer may need to check out an object exclusively, preventing other developers from modifying the same object. In either case, all or only part of any change can be adopted, while an audit trail keeps track of what changes were made, when, and by whom. Check-Out | Check-In capabilities allow a job to run normally while the job is optimized or otherwise changed offline. Importantly, the feature also supports rollback of changes and groups of changes (referred to as changesets) when necessary. As many enterprises have found, it is critical for updates to be rolled back when major problems occur. Changesets provide a central place to track changes and to revert all or part of a change with a few clicks if needed, shortening time to failure resolution while avoiding time-consuming and difficult system restores.

It should have an emphasis on governance. All workflow design should be accomplished with an emphasis on policy-driven governance. A comprehensive automation framework provides organizations, in a single pane of glass, an authoritative source for knowing exactly what’s running and when, with the ability to easily control and monitor the status and security of workflows. Key features for effective governance include a comprehensive audit framework, the ability to incorporate security-driven policies, revision history, rollback/restore options and changesets (described above), Check-Out | Check-In functionality (also described above), and management/monitoring of jobs with service level agreements (SLAs).

It must simplify recurring tasks. Exceptional workflow assemblers include Reference Functionality that allows developers to reuse critical portions of the same job or plan for different workflows. Reference Functionality allows hundreds, or even thousands, of reference jobs to mimic the logic of the original template job. This capability can significantly speed the design and modification of workflows, as well as reduce clutter and improve job management. Take, for instance, a group of FTPS processes using TLS (Transport Layer Security). In a typical automation platform that relies on traditional copy-and-paste, a change to the source, target, TLS protocol, or another parameter would require the developer to go into each job individually to make the change—a tedious and highly repetitive task. With Reference Functionality, however, the developer can make a change to a template and have that change automatically passed down to each workflow that references the template. Coupled with the use of variables, large numbers of jobs can be easily and reliably maintained. Thus, Reference Functionality and the use of variables allow developers to dynamically update workflows based on changing business and IT requirements.
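The FTPS example above can be sketched in a few lines of hypothetical Python: reference jobs hold only what differs (via variables), everything else is read from the shared template, and a protocol change is therefore made once. The objects and attribute names are illustrative, not the platform's actual model.

```python
from dataclasses import dataclass
from typing import Dict

@dataclass
class TemplateJob:
    """Shared definition; referenced jobs inherit anything they don't override."""
    protocol: str
    port: int

@dataclass
class ReferenceJob:
    name: str
    template: TemplateJob
    variables: Dict[str, str]   # per-job differences such as source and target

    def describe(self) -> str:
        return (f"{self.name}: {self.variables['source']} -> {self.variables['target']} "
                f"over {self.template.protocol} (port {self.template.port})")

ftps_template = TemplateJob(protocol="FTPS with TLS 1.1", port=990)
transfers = [
    ReferenceJob("Payroll feed", ftps_template, {"source": "hr-share", "target": "payroll-vendor"}),
    ReferenceJob("Bank file", ftps_template, {"source": "erp-out", "target": "bank-sftp"}),
    ReferenceJob("Audit extract", ftps_template, {"source": "dw-export", "target": "auditor-drop"}),
]

for t in transfers:
    print(t.describe())

# Tighten the protocol once on the template; every referencing job picks it up.
ftps_template.protocol = "FTPS with TLS 1.2"
for t in transfers:
    print(t.describe())
```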


>>Use Cases

PrimeSource

The power of an advanced workflow assembler can have a profound effect on process performance and resource utilization, as building materials distributor PrimeSource has learned. One of the largest distributors of its kind in North America, PrimeSource employs 1,200 people in 35 distribution centers across the U.S., Canada, and the Caribbean. It recently moved to SAP for all its ERP tasks, including data warehousing, inventory management, and invoicing. Combined with a shift of nearly all its systems to Windows, this made it difficult for PrimeSource to continue using IBM Tivoli scheduling for its automation tasks. Beginning with its data warehousing operations, the company began transferring workflows to a new, cross-platform IT workload solution. PrimeSource had purchased SAP Rapid Marts as an ETL tool; employing Business Objects (BOBJ) and Data Services, it found that one process was taking 9-1/2 hours to complete. “The process was very cumbersome,” stated Matt Sullivan, BI Manager for PrimeSource. Using the workflow assembler in its new platform to change the shape of the process, however, yielded big results. “We were able to make [the process] as modular as we wanted, running things in parallel,” Sullivan recalled. “With our new automation platform we cut the processing time down from 9-1/2 hours to just one hour.”

SUBWAY

Using a highly intuitive graphical interface is another way to save time, especially for developers. At its corporate headquarters, sandwich restaurant leader SUBWAY processes operational and sales data on a near real-time basis from more than 30,000 restaurants worldwide. The company relied on various SQL servers to schedule SSIS (SQL Server Integration Services) and DTS (Data Transformation Services) packages between a legacy OpenVMS system, a newer .NET/SQL platform, and its data warehouse solution. “Being able to dynamically manage the dependencies between SSIS and DTS jobs was critical, because we couldn’t run the data warehouse load until the OpenVMS and .NET platform was populated and synchronized,” stated Ciana Barrueco, SUBWAY Database Administrator. Once SUBWAY installed its more advanced enterprise IT automation platform, however, it was able to use the application’s graphical interface to plan its workflows, along with the dependencies between jobs. “Our new solution requires 75% less development time than SQL Server Agent to build new workflows and update existing ones,” Barrueco estimates. Not only are resources better utilized, but processes now run more reliably as well.



Sub-Zero

The ability to reuse jobs through referencing allowed appliance maker Sub-Zero’s IT department to significantly increase its agility. As the company grew, its collection of scheduling tools—SQL Server Agent, Windows Task Scheduler, and an iSeries scheduler—was becoming increasingly unwieldy. Jobs couldn’t be triggered across different environments and platforms, especially for BI and data warehousing purposes, and automating jobs of a similar nature was a major drain on developer time. By adopting an automation solution with Reference Functionality, Sub-Zero was able to dramatically speed up workflow assembly. In one case, Sub-Zero has a job that performs database backups, accepting parameters for the server, database, backup type, and destination. To perform full, differential, and log backups across every SQL Server in the enterprise, IT staff uses the platform’s reference capabilities to create three references per SQL Server. “This allows us to configure schedules separately, but still have one job,” said Jason Van Pee, Sub-Zero’s database administrator. “If we decide to backup using a different method, we only need to change one job, and all our SQL Servers are set.” The referencing functionality allows hundreds or even thousands of reference jobs to mimic the logic of the original template. A single change to the template object is automatically passed down to each reference without further action. The result is faster workflow creation—and, more importantly, easier maintenance of these jobs in the future.
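A brief, hypothetical Python sketch of the Sub-Zero pattern: one template job holds the shared backup settings, each reference stores only the server and backup type, and the rest is resolved from the template at run time. The server names, parameters, and paths are invented for illustration.

```python
from itertools import product
from typing import Dict, List

# Template job holding the shared backup settings (illustrative names only).
template: Dict[str, str] = {"command": "run-backup", "destination": r"\\backup-share\sql"}

# Each reference stores only what differs: the server and the backup type.
servers = ["SQL01", "SQL02", "SQL03"]
backup_types = ["full", "differential", "log"]
references: List[Dict[str, str]] = [
    {"server": srv, "backup_type": kind} for srv, kind in product(servers, backup_types)
]

def resolve(ref: Dict[str, str]) -> Dict[str, str]:
    """Merge the template with a reference's overrides at run time,
    so a single edit to the template covers every server."""
    return {**template, **ref}

for ref in references:
    job = resolve(ref)
    print(f"{job['server']}: {job['backup_type']} backup -> {job['destination']}")

# Change the backup destination once; all nine referenced jobs pick it up.
template["destination"] = r"\\new-backup-share\sql"
print(resolve(references[0])["destination"])
```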



Greater Than the Sum

While an advanced workflow designer/assembler is a major advantage for creating, testing, optimizing, and maintaining processes, the combination of such a designer/assembler with a rich content library can generate even greater benefits. Part Two of this paper showed how a rich content library can simplify and enhance the scripting process; together, a rich content library and a powerful workflow designer/assembler have been shown to cut workflow development time by 50% or more. Part One demonstrated how a scalable automation engine lets IT handle greater numbers, types, sizes, and complexities of business processes, and execute those processes faster, more reliably, and at lower overall cost—while also reducing risk, accommodating change, and lowering operational costs.

Scalability, a rich content library, and an advanced workflow designer are the trademarks of intelligent IT automation. With enterprises being called upon to do more, on more levels, at higher speeds, more reliably, and with greater cost-efficiency than ever before, it’s essential that IT organizations fully leverage all the capabilities that modern IT automation can provide.


ActiveBatch® is Redefining IT Automation

Take Your Organization . . .

From an Elemental Approach To an Architectural Approach

From Multiple Scheduling Tools To a Single Automation Solution

From Custom Scripting To Production-Ready Actions

From a Legacy Job Scheduler To a Solution Designed for Change

Start Redefining Your IT Automation

Learn More or Request a Demo >>

www.ActiveBatch.com