Tag Archives: batch workload
In the 21st century, we’ve given up our tool belts, relinquished the tape robots, and enjoyed the ease of keying in parameters, code fixes, and more on a PC dedicated to our needs. So, perhaps, it’s time to give up things like hand-managing batch performance. Sometimes we stick with things simply because they’ve become habit. We know there is probably a better way to do them, but we don’t want to fight the battle to get new software and face a learning curve.
Companies are more cost-focused than ever before. While some industries have always had narrow margins, every company is now looking for cost savings wherever possible. Soft-capping can be scary, but you still need to save money. So what do you do? The solution is LPAR sets.
Every year, when the switch to daylight saving time takes away an hour of sleep but gives you back light in the evening, many of us think about doing some spring cleaning. Have you ever considered how you could do some spring cleaning around the office? While cleaning your workspace is a great idea, and one too rarely acted on, you’ll get far more value from some ‘systems spring cleaning.’ What would that look like?
For those exploiting the benefits of sub-capacity pricing to reduce IBM software costs, the Rolling 4-Hour Average (R4HA) is a critical measure to watch—but are you watching it effectively? If you’re like many systems programmers and capacity managers, you’ve probably created a specific job to keep your company’s R4HA in check.
This job likely focuses on key online workloads—those that impact the costs of CICS, IMS, DB2 and other licenses—and monitors changes to transaction mixes, new workloads and code, which can all cause your average to creep up, leading to increased software charges. Despite this effort, however, you may continue to see your R4HA rise, because you’re missing one sneaky cost contributor: batch.
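To see why batch can quietly raise the billed peak, it helps to remember that the R4HA is just a trailing average of MSU consumption, conventionally built from samples taken every few minutes. The sketch below is a hypothetical illustration in Python, not IBM’s actual calculation (the real figure comes from SMF data via the Sub-Capacity Reporting Tool); the sample values, the 5-minute interval, and the workload shape are all assumptions chosen to show the effect:

```python
from collections import deque

def rolling_4hr_average(msu_samples, interval_minutes=5):
    """Trailing 4-hour average over MSU samples taken every
    `interval_minutes`. Illustrative only -- the arithmetic
    mirrors the R4HA idea, not IBM's exact SCRT process."""
    window_size = (4 * 60) // interval_minutes  # samples per 4 hours
    window = deque(maxlen=window_size)          # sliding window
    averages = []
    for msu in msu_samples:
        window.append(msu)                      # oldest sample drops off
        averages.append(sum(window) / len(window))
    return averages

# Hypothetical day: 4 hours of steady online work at 400 MSU,
# then a 2-hour batch window that pushes consumption to 700 MSU.
online = [400] * 48   # 48 five-minute samples = 4 hours
batch = [700] * 24    # 24 five-minute samples = 2 hours
r4ha = rolling_4hr_average(online + batch)

print(max(r4ha))      # peak rolling average, not the 700 MSU spike
```

Even though instantaneous usage hits 700 MSU, the peak rolling average in this made-up scenario is 550 MSU: the two batch hours blend with the two preceding online hours inside the 4-hour window. That is exactly how a batch window that “only” runs for a couple of hours can still drag the billed peak well above the online-only baseline.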
In previous posts, we’ve talked about how you can reclaim time for more important tasks by automating batch and getting it off your plate using ThruPut Manager. But many are wondering: what types of important tasks can ThruPut Manager really alleviate?
A current and very real example is the challenge of transforming gobs of data into usable information. It’s not getting any easier. Have you ever seen a hardware or software release from IBM that reduced the amount of data you had to process? With more capabilities, you get more data to process, analyze, correlate, and understand. The problem with data is that it isn’t actually information, and in combing through it to find the information you need to do your job, you find your time simply sucked away.