Happy 50th Birthday Mainframe!

To celebrate Mainframe’s 50th birthday some of our Triton Consultants share their favourite mainframe stories:

My first contact with the mainframe was as a lowly graduate Trainee Programmer at a large chocolate manufacturer. All programs were stored on punch cards; partly because they were very old code and partly to give us youngsters a sense of history. As part of my training I was slated to do a few nights of ‘Ops’, including a graveyard shift. Much of the work was collecting vast sheets of green-lined printout from the laser printers, and part of it was loading programs to be read in and executed on the mainframe. One particular program had been written to calculate the potential cocoa bean crop. It was big enough that it had to be delivered on a trolley from the card store down to the mainframe room, then inserted into the reader in blocks about the size of a house brick.

Well, I managed to wheel the trolley down to the mainframe room, and then sat there for about an hour feeding cards into the reader. The mainframe digested this code and went off to calculate the results, a process that was going to take about another 4 hours, so I started to wheel the trolley back out of the data centre over to the long-term card-store, through the car park. Unfortunately, as I was making my way through the car park (it was about 4 a.m. and this was back in the days when I had a social life, so I probably wasn’t at my most alert), I put one wheel in a drain and tipped the whole cart over. Punch cards everywhere.

I’d like to say that I owned up, dragged the whole lot back inside and spent hours and hours putting all the cards back in the right order. What I in fact did was stuff the whole lot back into boxes willy-nilly, on the grounds that it would be 12 months before the thing needed to be run again. I doubt the mainframe made much sense of the program the next time it saw it but, if it did succeed in reading the thing, I’m pretty certain it came up with a wildly inaccurate forecast for cocoa bean production.

Anonymous, Triton Consultant!


When I started working with mainframes in 1985, my first employer had just moved away from the use of punched cards as a means of inputting programs and data to the IBM machines (an IBM S/370 3081 I think). This meant a large number of unused punched cards suddenly became surplus to requirements, and people found some very inventive ways of making use of them – the most popular being prompt cards for speakers to use when giving presentations. I remember bringing home a stack for my mum to use as recipe cards, and she still has a few of them today.

Julian Stuhler, Triton Director


As a relative newcomer to mainframes, having missed out on the first 20 years of mainframe history, one of my most vivid recollections is relatively recent: the late 1980s, whilst I was working as a Computer Operator in a small Data Processing department (IT didn’t exist back in the 1980s; it was all Data Processing).

Having years earlier consolidated our Sperry Univac and Honeywell based applications onto an IBM 4381 running MVS/JES2, complete with banks of tall, shiny, blue 3380 cabinets, we had just performed our first major IBM mainframe upgrade, moving to a 3090 running MVS/ESA. All had gone well, and availability, particularly compared to the days of Sperry Univac and Honeywell, had improved massively.

A few months on, during an unremarkable night shift, whilst we were determining which takeaway establishment would be lucky enough to provide us with our evening meal (one of the most important responsibilities of a shift leader), there was a ring at the delivery door. No one rang the delivery doorbell other than local kids messing about. And it was a long walk down the dark, echoing corridor from the Computer Room to the delivery door.

“It’ll be kids. Just ignore them” advised one of my team urging me to make a decision on takeaway choice by shuffling a variety of menus in front of me.

Another, longer buzz from the delivery door.

“I’d better go. Even if just to warn the kids off”

I left the strangely soothing hum of the computer room to head down the eerily silent, dark, echoing corridor. To my surprise it wasn’t kids at the delivery door but a delivery man:



“Parcel for Data Processing Computer Room. Sign here please.”

“But we haven’t ordered anything?”

“Well it’s for this address. From The Netherlands. IBM by the looks of it”

“IBM in The Netherlands? We haven’t called IBM for anything. I’ll sign anyway”

I wandered back down the long, dark, echoing corridor to the safe haven of the Computer Room, puzzling over the mysterious parcel. The unremarkable night shift continued without further excitement or event, other than plentiful amounts of pizza, until daylight broke at about 07:00. Soon after, there was another buzz at the delivery doorbell. Someone’s in early, I thought. A quick jog down the long corridor and I eventually opened the delivery door, but not to someone I recognised.

“Morning. How can I help?”

“Hi. I’m Jeff the IBM engineer. I’ve come to fix one of your 3880 disk controllers. I assume the new part turned up last night from The Netherlands?”

“There’s nothing wrong with our 3880s? And we certainly didn’t raise an issue with IBM, but yes a parcel from the Netherlands turned up last night.”

“Ahh I’ll explain”

Over a cup of tea, Jeff the IBM engineer explained that the new 3090 mainframe configuration also included self-diagnosis software. What had happened was that the self-diagnosis software had identified performance degradation with one of the 3880 disk controllers, realised which component needed replacing, contacted the relevant department to order a new 3880 part from The Netherlands, and scheduled an engineer visit. Self-diagnosis had proactively identified a potential issue, preventing a possible outage. These days the response would have been ‘so what’, but nearly 25 years ago this was the stuff of science fiction, given the internet was in its infancy and unheard of to the vast majority of people. Even email was fairly new.

Often labelled ‘legacy’ and ‘prehistoric’, even a quarter of a century ago mainframe hardware and software was right at the cutting edge of technology, with innovative ways of providing high availability. And self-diagnosis, followed by part ordering and engineer booking, is still beyond many of today’s other operating platforms.

Paul Stoker, Triton Director


I started work on mainframes in December 1983. Access was via a terminal called a 3277, which displayed green characters on a black background. I had access to two mainframes. One ran an operating system called VM and was used to support the email system. The second ran MVS, the forerunner of z/OS, and was used to run a system called RETAIN, which was IBM’s defect support system. 30 years ago RETAIN was already 24/7 and supporting data mirroring!

You could only be signed on to one of the systems at a time. To sign on to the other you had to log off, physically turn a switch and log on again.

Nick Smith – Triton Associate Consultant



IT cost reduction & optimisation top the priority list for Mainframe customers

In their recent Mainframe study, BMC Software asked Mainframe customers what their top four IT priorities were. Cost reduction and optimisation came out on top, with 85% of respondents citing cost reduction as one of their key priorities.

The study went on to ask customers about their Four Hour Rolling Average (4HRA), the workload figure used by IBM to determine software costs. Batch jobs in various forms determined the peak for 62% of respondents, with prime online processing doing so for 37%.
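The mechanics of the 4HRA are easy to illustrate. The sketch below uses hypothetical hourly MSU figures and deliberately ignores the finer details of the real SCRT calculation (which works from SMF interval data across all LPARs); it simply finds the highest four-hour rolling average of MSU consumption in a day, which is the figure the monthly charge is based on:

```python
def four_hour_rolling_average(msu_by_hour):
    """Return the peak 4-hour rolling average (4HRA) from hourly MSU samples."""
    window = 4
    averages = [
        sum(msu_by_hour[i:i + window]) / window
        for i in range(len(msu_by_hour) - window + 1)
    ]
    return max(averages)

# Hypothetical 24-hour MSU profile: overnight batch, then a prime-shift online peak
msu = [300, 320, 310, 280,   # 00:00-03:00 overnight batch
       150, 140, 130, 200,   # early morning lull
       420, 450, 440, 430,   # prime online peak
       400, 380, 360, 340,
       250, 220, 200, 180,
       160, 150, 150, 150]

print(four_hour_rolling_average(msu))  # → 435.0 (the 08:00-11:00 window)
```

Here it is the online prime shift, not the overnight batch, that sets the 435 MSU peak, which is why knowing *when* the peak occurs matters before any tuning starts.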

One of the questions our consultants are often asked is “how can we reduce our Mainframe software costs?”. It can often seem that nothing can be done to reduce these costs; simply maintaining the status quo and not allowing costs to rise can be a challenge in itself for capacity planning teams.

When looking at batch processing, an unmanaged workload mix can greatly affect whether batch is contributing a large chunk of Mainframe software costs. By unmanaged workload mix we mean that, sometimes without realising it, organisations can be running non-essential batch jobs during the prime shift and pushing the workload peak up significantly. Each month a workload report is produced using IBM’s Sub-Capacity Reporting Tool (SCRT) and sent to IBM; this is used to determine the peak workload and therefore the charges applied for the software used. By carefully analysing the SCRT report and related SMF data it is possible to gain a clearer view of where peaks are occurring. Moving batch workload to a different time may bring peaks down, reducing the 4HRA and thus the cost.

Badly performing or slow-running applications are another source of woe when it comes to pushing up Mainframe costs. The graph below shows an example of MSU usage by application. If the total peak is 1900 MSUs, this is the level at which software products will be charged. For example, if your CICS application has not been tuned as well as it could be, or has a performance issue and is using more MSUs than it should, then the entire peak will be raised and the associated software costs will rise with it. There are potentially significant savings to be made by ensuring that the system is running as efficiently as possible.

zTune graph
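The stacked-peak arithmetic behind a graph like this can be sketched as follows. The per-application figures are hypothetical, chosen so the combined peak matches the 1900 MSU example in the text; the point is that trimming one over-consuming application lowers the single peak that everything is charged against:

```python
def combined_peak(profiles):
    """Peak of the summed hourly MSU usage across all applications."""
    return max(sum(hour) for hour in zip(*profiles.values()))

# Hypothetical hourly MSU usage per application over a 4-hour prime shift
apps = {
    "CICS":  [900, 950, 920, 880],
    "Batch": [400, 420, 410, 390],
    "TSO":   [150, 160, 155, 150],
    "Other": [380, 370, 375, 360],
}
print(combined_peak(apps))  # → 1900 (the level software is charged at)

# Tune the CICS workload down by 10% and recompute the charged peak
apps["CICS"] = [int(m * 0.9) for m in apps["CICS"]]
print(combined_peak(apps))  # → 1805
```

A 10% saving in one workload here takes roughly 95 MSUs off the charged peak, even though the other applications are untouched.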



These are just a couple of examples of the ways that organisations can reduce their Mainframe software costs; there are many, many more. Triton’s full zTune service looks at each and every one of these options and can bring organisations savings of 5% for a Phase 1 study and a further 10-15% for Phases 2 and 3.

With 93% of respondents in the BMC study indicating that the Mainframe is a long-term business strategy, finding ways to optimise and reduce costs is going to be vital for organisations in the years to come.

Find out more about zTune

BMC Survey




Triton customer CPA Global talk about why Consultancy on Demand works

CPA Global is the world’s top intellectual property (IP) management and IP software specialist, and a leading provider of outsourced legal services. With offices across Europe, the United States and Asia Pacific, CPA Global supports many of the world’s best known corporations and law firms with a broad range of IP and broader legal services.

Triton Consulting provide CPA Global with RemoteDBA services and Consultancy on Demand.  In this podcast we talk to Juandi Abbott, Business Integration Manager, about her experiences of working with Triton:


As part of the Consultancy on Demand service Triton have carried out:

• Conducting training and workshop sessions on-site

• Reviewing and updating database and instance level configuration parameters

• Designing and implementing a data placement strategy (including tablespace and bufferpool configuration)

• Designing and implementing a housekeeping strategy (known locally as RRR housekeeping: REORGs, RUNSTATS and REBINDs) specifically interfacing with the customer application

• Performance advice for specific queries and application workloads

• Database-specific assistance during application releases

• Design advice on DB2 features and tools and how these can be used within the application.

Find out more about Consultancy on Demand

Read more case studies

Read more testimonials



Great NEW event from IDUG in the UK – Technical Seminar London 2014

IDUG and IBM have announced a free one-day technical seminar in London this April. This is a really special event, with two of IBM’s top Distinguished Engineers as guest speakers: John Campbell from the Silicon Valley lab and Namik Hrle from the Boeblingen lab. We are also delighted to announce that our very own Julian Stuhler will be opening the keynote session.

The seminar is being held at the IBM Client Centre on London’s South Bank, and there will be DB2 for z/OS and DB2 LUW tracks.

To find out more and book your place, visit the IDUG website.



Are you currently experiencing issues with DB2 database management and resourcing?

Find out how organisations like yours are managing these issues.  Whether you have a small in-house team or are struggling to find DB2 resources, RemoteDBA can provide DB2 support and management around the clock, wherever you are in the world.

Register for the webcast – 18th March

The webcast will cover:

• Introduction to RemoteDBA
  • Fundamentals of the service
  • The process of problem identification and resolution
• Real-world examples of RemoteDBA working for global customers
• Partner introductions – KBCE and DBI Software


Iqbal Goralwalla – Head of Managed Services, Triton Consulting Ltd & IBM Champion

Klaas Brant – Owner, KBCE & IBM Gold Consultant

Scott Hayes – President & Founder, DBI Software & IBM Gold Consultant

Who should attend

This webcast is designed for Database Managers, IT Managers and IT Directors responsible for the smooth running of DB2.


Mainframe customers who have outsourced their IT infrastructure could be missing out on cost saving opportunities

If you are a large organisation running an IBM mainframe you could be paying too much for your enterprise software. Most mainframe software charges are based on peak CPU usage, so whatever your peak is in a given month is what you pay for.

If you’ve outsourced your IT infrastructure these peak charges can be passed on to you as the client. Outsource contracts are often constructed on a peak usage model, so a reduction in peak demand translates into a reduction in cost. Even if your IT infrastructure is outsourced, you can take control of your mainframe costs.

“I have seen many large mainframe customers struggle to get a clear view on when their workload peaks occur during the month across all mainframe workloads. This can be especially difficult for customers who have outsourced their IT infrastructure. Without this vital information it is very difficult to get workloads tuned effectively. Only once these peaks have been identified can organisations really bring down the cost of their mainframe software licensing through tuning activities” Paul Stoker, Director, Triton Consulting Ltd.


There are three key areas when looking at Mainframe cost reduction:

1 – LPAR Optimisation
2 – Workload Optimisation
3 – Workload Tuning


An outsourcing provider may not offer to run these optimisation and tuning projects as a standard part of their service. However, it is very important for organisations to take ownership of their mainframe software charges, whether they are outsourced or not, to ensure that they are keeping costs in check and keeping workload running at optimum performance levels.

The benefits of mainframe tuning can be felt across the entire business. From the CFO who will see significant reduction in IT spend through to the IT teams who benefit from improved application performance and thus improved customer service, a thorough tuning exercise can indeed improve business performance.

Find out more about the zTune service or download the white paper



Database Reliability and Deployment Trends Survey 2013

IBM DB2 customers are among the most satisfied in the industry

A recent survey by ITIC (http://itic-corp.com/) has highlighted DB2 as providing the highest reliability and customer satisfaction ratings for product performance, security, technical service and support, and the value of its pricing and licensing agreements.

The report results indicate that the inherent reliability of the major database platforms remains strong and continues to improve as the technology advances.

The survey explored a number of topics including:

• Inherent database reliability, high availability

• Impact of increased workloads on reliability

• Satisfaction with their vendors’ technical service and support and product warranties

• Specific situations that negatively impacted database reliability

• Manageability and ease of use

• Security

• Mission critical confidence of the database to support data intensive workloads

• Load performance/reliability of databases

Top stats from the survey

• 86% of IBM DB2 customers chose the platform for its reliability

• Just 56% of respondents selected Oracle DB for its reliability

• 76% of IBM DB2 customers selected it on the basis of its high performance

• 87% of IBM DB2 users gave the database high marks for manageability

• Just 46% of Oracle DB users gave the platform high marks for manageability

• 40% of organisations now keep 1 to 5 Terabytes of data in their databases

• 39% said that database integration and interoperability issues have the most impact on database reliability

• 84% of IBM DB2 survey participants indicated that performance improved “significantly” or “somewhat” in the past 12 to 24 months

• 79% of businesses have not done a total database replacement in the last three years

The survey went on to state that “results show that even the most inherently reliable databases and server hardware platforms can be undone by human error or lack of adequate manpower in the IT department”

Make sure your DB2 teams have adequate support through Triton’s range of:

Consultancy – http://www.triton.co.uk/consultancy/


Remote Support – http://www.triton.co.uk/db2-support

Download the full survey



Seeking DB2 Stars for DB2’s Got Talent

Triton Consulting are thrilled to see the return of DB2’s Got Talent. This competition has really grown in popularity since its inception by our partners DBI Software in 2011. It’s a great way for the DB2 community to share knowledge and celebrate great DB2 talent around the world.

We’ll be sponsoring the competition again this year, and we are keen to encourage anyone in the DB2 user community to get involved. Contestants are invited to share with their peers the DB2 experiences they have gained during the past year. There are some fantastic prizes on offer this year too, including:

- A free badge to any IDUG conference in 2014 plus $1,500 travel

- A free registration to The DB2 Symposium plus $1,000 travel

- An Apple iPad

- One of 10 $50 Amazon.com Gift Certificates

DB2’s Got Talent will be held on Fridays during February and March. You can find out more about the competition here.

Our partners DBI Software are always interested in hearing from aspiring new DB2 talent, and the earlier you submit your application the greater your chances of being selected to participate. So what are you waiting for? Download the application form today!


Database Support In the Public Sector

In recent years the Public Sector has faced difficult choices when it comes to cost management. The IT budget, along with everything else, has been under pressure and IT departments across the country have had to reduce costs and headcount.

Cost challenges increase risk

With reduced headcount comes an increased level of risk. With fewer IT support staff it is far more difficult for operation-critical data to be successfully maintained, leaving the organisation vulnerable to outages and uncontained performance problems.

“Having worked in the public sector I am fully aware of the budget pressures that IT departments are under. Tough choices have to be made but the mission critical database and applications within an organisation must be maintained in order for uninterrupted service to continue” Somu Chakrabarty, Senior DB2 Consultant, Triton Consulting Ltd.

Organisations running DB2 will find that without a dedicated DB2 DBA they can face significant problems in keeping their data available. A member of support staff may be trained in another area of database management, but other DBMS systems have very different structures and commands from DB2, which means that staff skilled in other systems may struggle to cope if issues arise in DB2. The cost to the organisation if one of its DB2 applications were to go down can be significant.

Organisations can reduce their annual DB2 support costs by up to 75% with RemoteDBA Office compared to one full time DBA.

Cost effective 24/7 support

Many public sector organisations require their data to be available 24/7 in order to provide services such as social care and emergency support. This can be a real challenge in an environment where budgets are tight and expertise is limited. Engaging an out of hours only support service to cover those times when the existing staff can’t be available is a cost effective solution to keeping vital services running.

Reducing training costs

Those organisations that are able to keep their existing DBA staff face the additional challenge of keeping up to date with current technologies. The price-tag attached to training and development can be hefty, but without staying up to date organisations face real risks of non-compliance, badly performing workloads and decreased efficiency. Working with an accredited DB2 support partner means that any gaps in knowledge can be plugged without the need for costly training.

One solution to all these challenges is to engage an external organisation with the specialist skills needed to cover what is lacking in-house. A RemoteDBA service means that an organisation can benefit from expert DBA support for a fraction of the cost of a full-time in-house DB2 DBA.

                                                Full Time DBA   RemoteDBA
No salary, training or other benefits to pay?         ✗             ✓
Holiday cover provided?                               ✗             ✓
Sickness cover provided?                              ✗             ✓
Additional DB2 performance tooling included?          ✗             ✓

With RemoteDBA from Triton Consulting you can be assured of a proven, stable and secure process for the cost-effective management of all components of your DB2 infrastructure.

Find out more



DB2 11 for z/OS – Easing Upgrade Anxiety

Friday 6th December

Join Julian Stuhler of Triton Consulting for the DB2Night Show this week where he will be helping to take the anxiety out of your next DB2 for z/OS upgrade.

Despite compelling business benefits to upgrade to new versions of DB2 for z/OS, many customers are unable to schedule migration projects in a timely manner. Two of the major issues responsible for these delays are scheduling the necessary remedial work to address any application incompatibilities introduced in the new release, and obtaining suitable change slots to allow the upgrade itself to proceed.

This presentation provides an overview of the new features introduced in DB2 11 for z/OS that are intended to address these issues, and allow customers to upgrade their DB2 systems more quickly and with less disruption than before.

Book your place here
