Effective September 2025, the EU Data Act aims to democratize data access across the EU by granting users of connected devices the right to access and share the data those devices generate. SMEs will benefit from measures designed to create a more favourable business environment. IT managers should adjust their data strategies now to capitalize on these opportunities.
Many businesses still rely on legacy storage because they are satisfied with their setup, do not want to disrupt operations, and see no reason to spend money replacing hardware that is not broken. However, legacy storage suffers compatibility and performance issues with modern applications and carries high maintenance and replacement costs. CIOs and IT leaders should upgrade their legacy storage to cut costs and improve application efficiency.
Early adopters are expected to have machine customers conducting transactions semi-autonomously by 2028 and fully autonomously by 2032. Businesses that continue to focus only on human customers will miss these machine customers and lose them to competitors. To stay competitive and grow profits, IT leaders and Customer Experience (CX) experts in retail must plan now to target machine customers.
Businesses continue to improve their efficiency with AI, which increases demand for LLMs that perform well on enterprise tasks. Fine-tuning is often too costly to be viable; prompt caching (also called context caching) and Retrieval-Augmented Generation (RAG) are more suitable alternatives. AI engineers should read this article to learn how these two methods deliver cost-effective LLMs that perform well on enterprise data.
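To make the RAG pattern concrete, here is a toy sketch: retrieve the most relevant documents for a query, then prepend them to the prompt. The keyword-overlap retriever and the document set are illustrative stand-ins; production systems typically rank with vector embeddings, and the final prompt would be sent to an LLM API.

```python
import re

# Toy document store standing in for an enterprise knowledge base.
DOCUMENTS = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Enterprise support tickets are answered within 4 business hours.",
    "The Q3 sales report showed 12% growth in the EMEA region.",
]

def tokens(text: str) -> set[str]:
    """Lowercase word set, ignoring punctuation."""
    return set(re.findall(r"\w+", text.lower()))

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank documents by word overlap with the query and keep the top k.
    A real RAG system would use embedding similarity instead."""
    ranked = sorted(docs, key=lambda d: len(tokens(query) & tokens(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, context: list[str]) -> str:
    """Augment the user's question with the retrieved context."""
    ctx = "\n".join(f"- {c}" for c in context)
    return f"Answer using only this context:\n{ctx}\n\nQuestion: {query}"

query = "What is the refund policy?"
print(build_prompt(query, retrieve(query, DOCUMENTS)))
```

The key design point is that the base model is never retrained: relevant enterprise data is fetched at query time and injected into the prompt, which is why RAG avoids the cost of fine-tuning.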
Chiplet technology is transforming the semiconductor industry from monolithic to modular system-on-chip (SoC) designs. By letting designers combine specialized dies, this approach improves performance, scalability, and flexibility across a range of sectors. Business leaders should understand this emerging chip design trend and its potential applications in their industry as the technology evolves.
The widespread adoption of Generative AI (GenAI) in applications offers substantial advantages but also introduces new threats because of the many components these systems comprise. To ensure the integrity of AI/ML systems, organizations should track every component through an AI Bill of Materials (AIBOM): an inventory of the data, models, and infrastructure used.
Developers, data scientists, and security experts should advance their AI maturity by adopting AIBOMs to secure and optimize their AI systems.
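As a minimal sketch of the idea, an AIBOM can start as a structured inventory serialized to JSON. The schema and component names below are illustrative assumptions, not a standard; standards efforts such as CycloneDX are extending SBOM formats to cover ML components, and a production AIBOM would follow one of those formats.

```python
import json
from dataclasses import asdict, dataclass, field

@dataclass
class AIBOMComponent:
    """One inventoried component of an AI system."""
    name: str
    component_type: str  # e.g. "model", "dataset", or "infrastructure"
    version: str
    source: str          # provenance: where the component came from
    license: str

@dataclass
class AIBOM:
    """A simple AI Bill of Materials for one system."""
    system_name: str
    components: list = field(default_factory=list)

    def add(self, component: AIBOMComponent) -> None:
        self.components.append(component)

    def to_json(self) -> str:
        return json.dumps(
            {"system": self.system_name,
             "components": [asdict(c) for c in self.components]},
            indent=2,
        )

# Hypothetical system and components, for illustration only.
bom = AIBOM("support-chatbot")
bom.add(AIBOMComponent("llama-3-8b", "model", "3.0", "Meta", "Llama 3 Community License"))
bom.add(AIBOMComponent("faq-corpus", "dataset", "2024-06", "internal CMS export", "proprietary"))
print(bom.to_json())
```

Even this simple inventory answers the security questions an AIBOM exists for: which model versions are deployed, what data trained or grounds them, and under what licenses.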
Retailers are facing a surge in shoplifting, highlighting the need for more effective deterrents. AI tools show promising results: they can deter shoplifters while also improving operational efficiency and customer experience. Tech leaders must learn about the latest AI methods that can enhance their stores' security.
State-of-the-art AI models require powerful hardware to run and have enormous file sizes, leading to high hosting and inference costs. SMEs often cannot integrate these models into their applications on a limited budget. Sparsity is a technique that prunes a model's parameters, producing a smaller, faster model that can even run on-device. An SME's IT team can use sparsity to integrate powerful models into their applications while reducing hosting and inference costs.
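The simplest form of pruning is magnitude pruning: zero out the weights with the smallest absolute values, on the assumption that they contribute least to the output. The sketch below applies it to a random weight matrix; real frameworks offer this (e.g. as utilities in PyTorch or TensorFlow) along with structured variants that hardware can exploit.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Return a copy of `weights` with the smallest-magnitude fraction
    (`sparsity`, between 0 and 1) of entries set to zero (unstructured pruning)."""
    k = int(weights.size * sparsity)
    if k == 0:
        return weights.copy()
    # The k-th smallest absolute value becomes the pruning threshold.
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))          # stand-in for one layer's weights
pw = magnitude_prune(w, 0.5)          # prune 50% of the parameters
print("non-zeros before:", np.count_nonzero(w), "after:", np.count_nonzero(pw))
```

Zeroed weights can be stored in sparse formats and skipped at inference time, which is where the size and speed savings come from; in practice models are usually fine-tuned briefly after pruning to recover accuracy.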
APIs have made AI applications easier than ever to build. One challenge is cost: applications often make frequent LLM calls that repeat largely the same content to provide context. Prompt caching, or context caching, addresses this by caching that shared content so it is not reprocessed on every call. AI engineers should use prompt caching to cut inference fees and reduce latency.
High-quality data is an essential ingredient of a successful AI model: Meta's Llama 2 and Llama 3 were trained on 2 trillion and 15 trillion tokens, respectively. Much of this data is obtained through web scraping, which has led many businesses to sue AI developers for using copyrighted material. IT leaders and content strategists can prevent their data from being scraped by blocking the web crawlers that gather data for AI training.
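The usual first step is a robots.txt file at the site root that disallows the user agents AI operators publish for their crawlers, as in the sketch below. The listed agents are real published crawler names but the list is illustrative, not exhaustive, and robots.txt is only honoured by compliant crawlers; determined scrapers require server-side blocking.

```
# robots.txt — opt out of known AI-training crawlers
User-agent: GPTBot            # OpenAI's training crawler
Disallow: /

User-agent: CCBot             # Common Crawl, widely used for AI training sets
Disallow: /

User-agent: Google-Extended   # Google's AI-training opt-out token
Disallow: /
```

Each `Disallow: /` rule tells the named crawler it may not fetch any path on the site, while leaving ordinary search-engine crawlers unaffected.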