The Future For Software In 2025

Software lives. Every application and every smaller component of code is created through a living process that we quite directly call the software development lifecycle. But the SDLC is really just a term for individual applications and data services; there is also a wider living progression that happens as a result of the total technology industry's efforts to evolve and extend. Sometimes it's tough to know where the two streams will meet, but it's fairly obvious when they clash and the wider path of software evolution threatens to break existing systems or leave them fragile, brittle and vulnerable.

All of this combined reality means that - if for no other reason than as a way to mark the year's end - we tend to use this period to reflect on what has happened in technology and to look ahead to what we might expect in the twelve months to come.

Only the most fanciful prediction purveyor would dare to suggest that we'll hear less about artificial intelligence in 2025. One day soon (spoiler alert: it's not next year) we'll be able to talk about AI as an implicit functionality that features inside software applications. Rather like the way you no longer get excited by spellcheck functions, speech recognition and (if you're really progressive) real-time application responses that occur at the speed of a click, at some point we will be able to think of AI as an intelligence function that forms part of the way both enterprise and consumer software applications work.

This "absorption" of assumed AI may not happen until the end of the decade and we may well be focused on smart automation accelerator technologies for a long time after, but smart intelligence will begin to weave itself more subtly into the fabric of software as we move forwards.

"The introduction of ChatGPT two years ago sparked an 'AI summer' of massive excitement and investment. According to a recent CNBC report, $26.8 billion was invested in nearly 500 generative AI deals, extending the 2023 trend when GenAI companies raised $25.9 billion. The "bubble" is unlikely to burst in 2025, but we are entering an "AI fall" as organisations struggle to scale the implementation of AI and where investors, business leaders and boards start expecting returns on their investments," said Kjell Carlsson, head of AI strategy at Domino Data Lab in his end of year address to practitioners.

That doesn't necessarily mean that the AI bubble is likely to burst, but it might mean that enterprises start to see through the hype and work out where they can apply AI (if at all) in terms of practical, real-world use cases.

As we've said before, AI will not replace HR managers and AI will not replace software application developers, but there will be some realignment of the way industrial processes work. It's not outrageous to suggest that developing nations with large customer service industries such as call centers (or other back-end administrative job functions) could experience job losses.

In the hunt for the next big thing, the debate will likely carry on between the slow emergence of quantum computing and the quiet revolution that is photonic (light-based) computing. There should be no knee-jerk reactions here; in the months immediately ahead we can envisage a considered and methodical hybrid adoption of these advancements, unless they are applied close to the source for specialized, sophisticated use cases. Where we will hear about quantum is most probably in the security space.

"Many of the fears expressed after the arrival of ChatGPT and the presence of AI in everyday life have not materialized like autonomous systems threatening human engagement and AI's potential to brute force data encryption. This is not meant to downplay the impact that we've seen with the emergence of AI, this is meant to illustrate the significantly greater impact we are going to see from the advent of quantum computing. In 2025 organizations need to at least develop a plan to migrate to encryption that isn't vulnerable to quantum computing attacks," said Brian Spanswick, CIO and CISO at Cohesity.

There will be other key developments that cause less of a fanfare than those mentioned so far; the rise of so-called hyper-personalized experiences is around the corner. The term is as potentially confusing as the question of whether or not we should hyphenate 'hyperpersonalized'.

This software application development trend will manifest itself at various levels. We might expect website pages to alter their form and function according to user preferences. When you install an application on your device, it may now arrive pre-populated with policy-approved data that you have agreed to share across your device and its services.

Service chatbots will know more about you and possibly (although this is neither a threat nor a promise) be able to provide accurate conversational AI-driven services that actually convince users to opt for a computer-assisted experience rather than demanding a human hand-off. The rise of hyperpersonalization could also see us exposed to dynamic pricing regimes, finely tuned customer-specific offers and location-based updates related to what we want to buy or experience.
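As a purely illustrative sketch of what that kind of hyper-personalization logic can look like, the Python below adjusts an offer from a hypothetical user profile; the field names, discount rules and consent flag are assumptions for the example, not any vendor's actual pricing engine.

```python
# Hypothetical sketch of rule-based hyper-personalization of an offer.
# Profile fields and adjustment rules are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class Profile:
    loyalty_tier: str          # e.g. "gold", "silver", "none"
    region: str                # e.g. "EU", "US"
    consented_to_offers: bool  # policy-approved data sharing


def personalized_offer(base_price: float, profile: Profile) -> dict:
    price = base_price
    messages = []
    if profile.consented_to_offers:  # only act on data the user agreed to share
        if profile.loyalty_tier == "gold":
            price *= 0.90            # finely tuned, customer-specific discount
            messages.append("Gold member price applied")
        if profile.region == "EU":
            messages.append("Prices shown include VAT")
    return {"price": round(price, 2), "messages": messages}


print(personalized_offer(100.0, Profile("gold", "EU", True)))
```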

None of these developments will happen without us questioning sustainability; in the case of AI services that stem from cloud datacenters in particular, the amount of power needed to run these backbone services will come under increasing scrutiny.

How much software service is too much software service? We probably don't know, because there is such a gargantuan effort to deliver new cloud-native functions and organizations are currently attempting to drink from the firehose of so-called digital transformation innovation. Next year could see a new degree of reckoning in this space as the IT trade itself and international governments start to look at real-world power usage and consumption.

But at the same time, new efficiencies may arise.

We may also see a surge in large-scale enterprise software refactoring (the restructuring and rewriting of outdated legacy code bases) as part of a wider march towards modernization. But firms will be looking for restructuring solutions that a) don't break other existing systems and b) don't break the bank.

"The movement toward modernizing legacy systems will uncover some harsh realities surrounding generative AI's ability to execute," proposes Diffblue CEO Toffer Winslow. "Most refactoring processes start with deterministic rules that are sometimes supplemented with generative AI techniques to deal with edge cases. Furthermore, not all AI agents are equal. If you're derisking a project that relies heavily on unit testing, it won't be enough to use agents that can only automate test creation at the class and method level. As a result, we'll see a shift in 2025 toward agents that can handle repository-level automation."

A key part of legacy modernization may be the move to desktop virtualization software solutions. "Looking into 2025, we'll see an increase in demand for cost efficiency coupled with business continuity and resiliency in virtual desktop solutions. These themes are echoed consistently by our customers, who are asking questions like, 'How do I ensure my Microsoft Azure Virtual Desktops remain accessible at all times?' and 'How can I meet stringent continuity requirements?' among other challenges," says Amol Dalvi, VP of product at Nerdio.

He further suggests that observability will become a cornerstone for businesses seeking to optimize the performance and user experience of their virtual desktop solutions. As more organizations adopt desktop-as-a-service, there will be a logical growth in the need for real-time insights into system health, user experience and any potential disruptive issues. Dalvi says that users are looking for tools that enable them to proactively identify performance bottlenecks and respond to them before they impact productivity.
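As a rough illustration of the kind of real-time health signal being described, the sketch below samples basic host metrics and flags anything over a threshold. The thresholds and the `psutil` dependency are assumptions for the example; production observability suites obviously go much further than this.

```python
# Sketch: sample basic host metrics and raise alerts before users feel the pain.
# Assumes the 'psutil' package; threshold values are illustrative only.
import psutil

THRESHOLDS = {"cpu_percent": 85.0, "memory_percent": 90.0, "disk_percent": 90.0}


def health_snapshot() -> dict:
    """Collect a point-in-time view of CPU, memory and disk pressure."""
    return {
        "cpu_percent": psutil.cpu_percent(interval=1),
        "memory_percent": psutil.virtual_memory().percent,
        "disk_percent": psutil.disk_usage("/").percent,
    }


def alerts(snapshot: dict) -> list[str]:
    """Return a human-readable alert for every metric over its threshold."""
    return [
        f"{metric} at {value}%"
        for metric, value in snapshot.items()
        if value > THRESHOLDS[metric]
    ]


print(alerts(health_snapshot()))
```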

Observability then - and the need for what marketing people love to call 'actionable insights' and what software and data engineers just call 'stuff to fix' - will continue to abound as we virtualize an increasingly wide span of the IT stack.

"Almost every company now relies in some way on technology and the software that underpins it to remain competitive. This adoption will continue in increasingly innovative ways, including the extensive adoption of artificial intelligence," said Matt Middleton-Leal, vice president for EMEA region at Qualys.

But, he says, it is easy to overlook the legacy applications that will also remain in place and underpin many key services as we race to adopt new, more exciting capabilities.

"For the foreseeable future, businesses will have a mix of new and legacy software to maintain and manage. Knowing what is included and the associated risk within the software bill of materials and the associated hardware will be fundamental to staying secure. Shifting left from a security testing perspective will be the most effective strategy to ensure issues are captured prior to code going live across old and new systems," added Middleton-Leal.

Overall we can clearly say that as organizations grapple with AI and its dual role as both an enabler and a disruptor, the software system management landscape is undergoing a seismic shift. The old playbook of detecting problems at patient zero (i.e. the first signs of sickness or, in security terms, the first signs of a vulnerability) and then reacting often won't cut it anymore. Today's bad actors are becoming more prolific, using AI to create targeted, smarter, autonomous threats that can easily slip past traditional defenses.

"The increasing sophistication of AI-powered malware renders legacy security models, like the traditional kill chain, obsolete. Attackers are using AI to generate endless malware variants capable of evading detection, leading to an exponential rise in patient zeros," says Infoblox CEO, Scott Harrell. "To stay ahead, organizations need to move beyond reactive defenses and embrace proactive, cyber security solutions. Domain name system [widely known as DNS, a technology that enables a hierarchical and distributed naming system service for computers and other IT resources] plays a critical role here i.e. it provides early visibility into adversaries' infrastructure, allowing organizations to identify and stop threats in advance of attackers creating and deploying new malware variants. Taking this kind of proactive stance isn't just smart; it's the future of how we will run our IT stacks."

But however much AI talk there is out there, we should always remember that artificial intelligence is only as good as the data we feed it; as the old adage goes: garbage in, garbage out.

"At the points let's emphasize the fact that much of data's value (and whether it might rank as garbage-like) depends upon its timeliness i.e. the older it is, the less likely it is to be relevant," said Peter Pugh-Jones, director of financial services at Confluent. "With that in mind, it's the ability to deliver real-time data that makes data streaming such a hot commodity right now. Businesses can't achieve the potential that AI offers unless they lay the right foundations to fuel it - and those foundations must include access to an up-to-the-second understanding of their organizations."

Pugh-Jones works for a data streaming specialist, so clearly he's upbeat about the need to champion this topic. That said, he does point to the emergence of the data streaming engineer as a now more-formalized role that will serve to meet the needs of this space. As the hype cycle for AI starts to deflate and businesses continue to battle the AI skills gap, he thinks that we'll see an increasing appetite for infrastructure specialists solely focused on delivering a robust, compliant and real-time pipeline of data across businesses.
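For a sense of what laying those real-time foundations means at the code level, here is a minimal data-streaming sketch assuming the confluent-kafka Python client and a broker on localhost; the topic name and payload are hypothetical placeholders rather than anything from Confluent's own materials.

```python
# Minimal data-streaming sketch: publish timestamped events to a Kafka topic.
# Assumes the 'confluent-kafka' client and a broker at localhost:9092;
# the topic name and payload are hypothetical.
import json
import time

from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})


def publish_event(topic: str, event: dict) -> None:
    """Stamp an event with its production time and publish it."""
    event["produced_at"] = time.time()  # lets consumers judge data freshness
    producer.produce(topic, value=json.dumps(event).encode("utf-8"))
    producer.flush()


publish_event("account-activity", {"account_id": "A-123", "action": "login"})
```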

With all the talk of next-generation this, new-age that and AI-driven everything, could an actual new iteration of software emerge?

"In 2025, the first software 2.0 applications will emerge. Software development and engineering is already being democratized with tools like code copilots. However, the transformation of enterprise software is just the beginning. Today, software workflows are the same whether they are executed the first time or the millionth time. New software will start to emerge in 2025 that will learn from usage... and without active coding, will improve user experience and productivity. This will be the beginning of a 20-year era of software transformation," said Indu Keri, general manager and head of engineering for Nutanix Hybrid Cloud.

Nutanix's Keri said "learn from usage" and that's key. Software development will now get an even more amplified set of tools to ensure that we don't have to reinvent the wheel. This will emerge (among other places) across the full spectrum of the Ops-operations portmanteau portfolio, from FinOps (IT operations focused on cost) to DataOps (data-centric operations team tasks) to ModelOps (sysadmins and database administrators who concentrate on AI model status and health and not much else)... and onward to plain old DevOps, as developers and operations teams unify around common goals.

Aside from the clinically definable world of software code and its evolutionary path, we'll also spend more time next year on developer, user and all-stakeholder wellbeing. As we use data analytics to more accurately assess how people react to the technology services that are actually running their lives, from life sciences services to financial apps to games and entertainment, it will matter even more that we analyze how users are feeling.

That way, what we build next year can be even better... and if that's not a New Year's resolution then what is?
