When is the best time to refresh your IT?
04 November 2015
The IT industry never stands still and neither can an organisation’s IT infrastructure, with new products and services entering – and leaving – the market on a daily basis.
Companies must eventually refresh their IT, but finding an appropriate time to schedule a refresh, and knowing how often to do so, can leave IT managers and business owners in a quandary.
Frequent new releases can cause more than a few head-scratching moments when deciding whether to upgrade immediately.
Rather than jump in straight away, however, it's often advisable to give new releases some "settling in" time to see how they are received in the market. This also allows any initial bugs to be ironed out.
When deciding whether to refresh infrastructure, it's also important for specifiers to assess the business impact. If an organisation finds that inadequate IT support is starting to hurt its performance in the market, it is clearly time for a change.
In deciding how and when to carry out a refresh, IT teams need to take a variety of elements into account, from technical considerations to the ever-important financial aspect. For starters, organisations need to explore licence costs for old and new software, maintenance fees for old hardware, and the administrative costs of managing it.
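As a rough illustration of that comparison, the sketch below totals the yearly cost of keeping legacy kit against the cost after a refresh. All figures are hypothetical placeholders, not benchmarks; an organisation would substitute its own numbers:

```python
# Rough annual-cost comparison: keep legacy kit vs refresh.
# All figures below are hypothetical placeholders for illustration.

def annual_cost(licence, maintenance, admin):
    """Total yearly cost of licences, maintenance and administration."""
    return licence + maintenance + admin

# Assumed costs per year for the existing estate
legacy = annual_cost(licence=12_000, maintenance=8_000, admin=15_000)

# Assumed costs after a refresh: newer licences may cost more,
# but maintenance fees and admin overhead typically fall
refreshed = annual_cost(licence=15_000, maintenance=2_000, admin=9_000)

print(f"Legacy:    £{legacy:,} per year")
print(f"Refreshed: £{refreshed:,} per year")
print(f"Annual saving if refreshed: £{legacy - refreshed:,}")
```

Even a back-of-the-envelope model like this makes the trade-off visible: a higher licence bill can still be the cheaper option once maintenance and administration are counted.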
Where to start?
When prioritising what to address first, specifiers may look to server-side refreshes, which often bring more immediate benefits than desktop overhauls, though end users might well contest that.
New server architectures are attractive primarily for their higher equipment densities and lower power requirements. Converged systems also require less cobbling together and offer "single pane of glass" management.
That said, desktop upgrades cannot be neglected and will eventually have their time too. It’s important to take into consideration the whole cost of managing and maintaining existing devices. These days, most devices are reliable from a hardware perspective and most issues are resolved either by “turning it off and on again”, re-imaging or adding more memory.
Where software appears to be the main consideration, assess whether the legacy device can work with the new platforms or upgrades the business needs to adopt. Conversely, do you have an application that must run on legacy devices, no matter how much you want to upgrade?
Currently, on average we’re seeing laptop upgrades roughly every three to four years and desktop upgrades every four to five years.
This is primarily due to technology hitting a plateau when it comes to performance. By this, I mean that processor power, even on lower level chips, is sufficient for most office applications or use. Apart from adding more memory, devices bought three to four years ago are still fit for purpose.
With this in mind, it's important to know how old your IT estate is. IT managers can then be confident they know exactly when each piece of equipment reaches the end of its optimum life and, consequently, when it needs to be refreshed. That way, no device in the estate should be past its cut-off point.
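One way to operationalise that is a simple estate-age check. The sketch below applies the typical refresh cycles mentioned above to a device inventory; the inventory data, asset tags and field names are hypothetical:

```python
from datetime import date

# Typical refresh cycles from above: laptops ~3-4 years, desktops ~4-5.
# Cut-offs here take the upper end of each range (an assumption).
CUTOFF_YEARS = {"laptop": 4, "desktop": 5}

# Hypothetical inventory: (asset tag, device type, year purchased)
estate = [
    ("A001", "laptop", 2010),
    ("A002", "desktop", 2012),
    ("A003", "laptop", 2014),
]

def due_for_refresh(estate, today=None):
    """Return asset tags of devices older than their cut-off point."""
    year = (today or date.today()).year
    return [tag for tag, kind, bought in estate
            if year - bought > CUTOFF_YEARS[kind]]

# As of late 2015, only the 2010 laptop is past its cut-off
print(due_for_refresh(estate, today=date(2015, 11, 4)))
```

A real estate would come from an asset-management system rather than a hard-coded list, but the principle is the same: make the cut-off explicit, then flag anything beyond it.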
Making smarter decisions
The improved longevity of client-side technology provides an opportunity to recycle hardware within the business, while still allowing investment where the latest and greatest technology is required.
Wiping, rebuilding or adding more memory to devices will improve the user experience for administrative users, who are often overlooked when it comes to new technology.
Client devices such as desktops and notebooks are becoming ever more varied and commoditised, allowing resellers to offer subscription-based devices, where a business pays monthly over three years for a device. This can free up funds to invest in the supporting infrastructure and to consider modern options for infrastructure deployment, whether in hardware, software or hosted environments.
Infrastructure refreshes will inevitably require an investment. Yet embarking on this type of refresh can actually be a good opportunity to pause and explore architectural changes that could drive down costs and improve efficiency. A move to a Virtual Desktop Infrastructure (VDI) or a thin client model could offer one way for an organisation to meet these aspirations.
Thin client computing can help to centralise information, making governance, risk management and compliance much easier for the business.
These types of architectural shifts may streamline costs in one area of the infrastructure, but they often incur more refresh costs in another part. VDI will usually entail server and networking upgrades for example. In the complex world of IT infrastructure management, there’s always a trade-off somewhere.
Gary Price is product and category manager at Probrand.