Also, as part of the partnership, Veeam will integrate Microsoft AI services and machine learning (ML) capabilities into its data resilience platform, Veeam Data Cloud. This allows organizations to bring down costs and simplify data management. This partnership signals a shift in data resiliency and its importance.
The five fastest-growing hubs for data center expansion are an interesting mix of urban areas with one thing in common: lots of available power. Based on projected data-center capacity growth, Las Vegas/Reno ranks among the leaders. We’re only at the start of the AI boom, so data center growth isn’t going to slow down anytime soon.
As data centers evolve from traditional compute and storage facilities into AI powerhouses, the demand for qualified professionals continues to grow exponentially and salaries are high. The rise of AI, in particular, is dramatically reshaping the technology industry, and data centers are at the epicenter of the changes.
Data centers this year will face several challenges as the demand for artificial intelligence introduces an evolution in AI hardware, on-premises and cloud-based strategies for training and inference, and innovations in power distribution, all while opposition to new data center developments continues to grow.
Even as demand for data infrastructure surges to an all-time high, Equinix is planning to lay off 3% of its workforce, suggesting a growing skills mismatch in the industry. According to Goldman Sachs, data center demand in the US alone is projected to nearly triple by 2030, driving more than $1 trillion in investment.
Data architecture definition: Data architecture describes the structure of an organization’s logical and physical data assets, and data management resources, according to The Open Group Architecture Framework (TOGAF). An organization’s data architecture is the purview of data architects. Curate the data.
Solidigm, a provider of NAND flash solid-state drives, and memory giant Micron have each introduced very high-capacity SSDs meant for enterprise data centers, particularly for AI use cases. Solidigm unveiled its highest-capacity PCIe drive yet, the 122TB Solidigm D5-P5336 data center SSD.
Microsoft has introduced a new design for data centers to optimize artificial intelligence (AI) workloads, implementing a cooling system that it claims will consume zero water. Traditionally in Microsoft data centers, water has been evaporated on-site to reduce the power demand of the cooling systems. Many also use air cooling systems.
Once the province of the data warehouse team, data management has increasingly become a C-suite priority, with data quality seen as key for both customer experience and business performance. But along with siloed data and compliance concerns, poor data quality is holding back enterprise AI projects.
Massive global demand for AI technology is causing data centers to increase spending on servers, power, and cooling infrastructure. As a result, just four companies (Amazon, Google, Meta, and Microsoft) will account for nearly half of global data center capex this year, he says.
Business leaders may be confident that their organization’s data is ready for AI, but IT workers tell a much different story, with most spending hours each day massaging the data into shape. “There’s a perspective that we’ll just throw a bunch of data at the AI, and it’ll solve all of our problems,” he says.
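The daily "massaging" described above typically means normalizing fields, dropping rows with missing keys, and deduplicating before any model sees the data. A minimal sketch of that kind of cleaning pass, with hypothetical field names ("customer_id", "email") chosen purely for illustration:

```python
# Sketch of a pre-AI data cleaning pass: trim and lowercase text fields,
# drop rows missing the join key, and deduplicate on that key.
# Field names are illustrative assumptions, not any specific schema.

def clean_records(records):
    seen = set()
    cleaned = []
    for rec in records:
        cust_id = (rec.get("customer_id") or "").strip()
        if not cust_id:          # drop rows missing the join key
            continue
        if cust_id in seen:      # deduplicate on the key
            continue
        seen.add(cust_id)
        email = (rec.get("email") or "").strip().lower()
        cleaned.append({"customer_id": cust_id, "email": email})
    return cleaned

raw = [
    {"customer_id": " A1 ", "email": "USER@EXAMPLE.COM"},
    {"customer_id": "A1", "email": "user@example.com"},    # duplicate key
    {"customer_id": "", "email": "orphan@example.com"},    # missing key
]
print(clean_records(raw))  # one clean record survives
```

Even this toy version shows why the work is hours-per-day at scale: every rule (which key, which normalization, which duplicate wins) is a judgment call that varies by source system.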
With all the focus on power consumption, the water consumption of data centers has been somewhat overlooked. Data centers are already wearing out their welcome in certain areas. Data centers need so much water because water is used in a variety of functions. “I believe some data center operators just bowed out,” said Howard.
Palo Alto Networks is teaming with NTT Data to allow the global IT services company to offer an enterprise security service with continuous threat monitoring, detection and response capabilities. NTT Data’s MXDR service offers 24×7 incident detection and response and AI-driven threat intelligence orchestration and automation, Mehta stated.
In 2018, I wrote an article asking, “Will your company be valued by its price-to-data ratio?” The premise was that enterprises needed to secure their critical data more stringently in the wake of data hacks and emerging AI processes. Data theft leads to financial losses, reputational damage, and more.
According to a report released this week by Bloom Energy, US data centers will need 55 gigawatts of new power capacity within the next five years. The report , based on a survey of 100 data center leaders, also shows that 30% of all sites will be using onsite power by 2030.
The construction of massive data center campuses is booming, with hyperscalers, colocation providers and large enterprises developing new capacity to support the exploding requirements of AI. These are not normal times: there has always been growth in data center capacity, but never anything like this.
From customer service chatbots to marketing teams analyzing call center data, the majority of enterprises (about 90%, according to recent data) have begun exploring AI. For companies investing in data science, realizing the return on these investments requires embedding AI deeply into business processes.
Cisco is boosting network density support for its data center switch and router portfolio as it works to deliver the network infrastructure its customers need for cloud architecture, AI workloads and high-performance computing. Cisco’s Nexus 9000 data center switches are a core component of the vendor’s enterprise AI offerings.
In 2019, Gartner analyst Dave Cappuccio issued the headline-grabbing prediction that by 2025, 80% of enterprises will have shut down their traditional data centers and moved everything to the cloud. The enterprise data center is here to stay. Six years ago, nearly 60% of data center capacity was on-premises; that’s down to 37% in 2024.
On the demand side for data centers, large hyperscale cloud providers and other corporations are building increasingly bigger large language models (LLMs) that must be trained on massive compute clusters. Still, several questions remain about DeepSeek’s training, infrastructure, and ability to scale, Schneider stated.
Space supply in major data center markets increased by 34% year-over-year to 6,922.6. Data center headwinds: The growth comes despite considerable headwinds facing data center operators, including higher construction costs, equipment pricing, and persistent shortages of critical materials like generators, chillers and transformers, CBRE stated.
A growing number of buyers have reported purchasing supposedly new Seagate data center-grade hard drives, only to discover that they had been previously used for thousands of hours. The fraudulent sales first came to light in January when buyers began inspecting their newly purchased Seagate Exos data center-grade HDDs.
The products that Klein particularly emphasized at this roundtable were SAP Business Data Cloud and Joule. Business Data Cloud, released in February, is designed to integrate and manage SAP data and external data not stored in SAP to enhance AI and advanced analytics.
As someone deeply involved in shaping data strategy, governance and analytics for organizations, I’m constantly working on everything from defining data vision to building high-performing data teams. My work centers around enabling businesses to leverage data for better decision-making and driving impactful change.
A report by TD Cowen indicating that Microsoft is slowing down or possibly redirecting its data center construction plans is tied to the company potentially being in an oversupply position, analysts with the brokerage say. Over the past four years, he said, Microsoft has been leasing more data centers than owning.
Help from a hub: Understanding this reality, Corporate One created a data orchestration hub that allows different cores to connect to other services. Because the different core technologies are connected through the hub, each pairing no longer needs its own authentication.
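The hub pattern described above can be sketched as a thin routing layer: a caller authenticates once with the hub, which then brokers requests to whichever core system is registered, so no core needs a direct authentication handshake with every other service. The class, token, and service names below are hypothetical illustrations, not Corporate One's actual design:

```python
# Hypothetical sketch of a data orchestration hub: authenticate once with
# the hub, then route requests to any registered core system.

class OrchestrationHub:
    def __init__(self):
        self.cores = {}        # core name -> request handler
        self.sessions = set()  # tokens the hub has authenticated

    def register_core(self, name, handler):
        self.cores[name] = handler

    def authenticate(self, token):
        # Stand-in for a real identity check (e.g. OIDC); assumed for illustration.
        if token:
            self.sessions.add(token)
        return token in self.sessions

    def route(self, token, core_name, request):
        if token not in self.sessions:
            raise PermissionError("caller not authenticated with the hub")
        return self.cores[core_name](request)

hub = OrchestrationHub()
hub.register_core("payments", lambda req: {"status": "posted", "req": req})
hub.authenticate("token-123")
print(hub.route("token-123", "payments", {"amount": 100}))
```

The design choice mirrored here is N+1 trust relationships (each core trusts the hub) instead of N×N pairwise integrations each carrying its own credentials.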
“In line with this, we understood that the more real-time insights and data we had available across our rapidly growing portfolio of properties, the more efficient we could be,” she adds. “Off-the-shelf solutions simply didn’t offer the level of flexibility and integration we required to make real-time, data-driven decisions,” she says.
The chief architect for Intel’s Xeon server processors has defected to chip rival Qualcomm, which is making yet another run at entering the data center market. If Intel was hoping for a turnaround in 2025, it will have to wait at least a little bit longer.
It’s conceptually similar to how enterprises developed digital value chains that enabled data to infuse digital experiences, at pace and scale, in order to increase their value. You want your genAI app developers to be able to build access to the right data sources into tailored enterprise apps, which we represent with the diagram below.
AMDs Secure Encrypted Virtualization (SEV), meant to protect processor memory from prying eyes in virtual machine (VM) environments, can be tricked into giving access to its encrypted memory contents using a test rig costing less than $10, researchers have revealed.
Nokia has announced that CEO Pekka Lundmark will step down. He took up the position in 2020. He will be replaced by Justin Hotard, who is currently Intel’s Chief Data Center Officer and has previously held executive positions at technology companies such as Hewlett Packard Enterprise and NCR Corporation.
At issue is how third-party software is allowed access to data within SAP systems. The reason: Sharing data from the SAP system with third-party solutions is subject to excessive fees. But SAP and its customers benefited.
HPE claims that this approach effectively reduces the required data center floor space by 50% and reduces the cooling power necessary per server blade by 37%. Data centers are warming up to liquid cooling: AI, machine learning, and high-performance computing are creating cooling challenges for data center owners and operators.
The transceiver modulates light waves that transmit data, making it much more efficient than copper. In large-scale AI data centers with thousands of switches, this equates to hundreds of thousands or even millions of transceivers. AI has transformed the way data centers are being designed.
Fortinet is expanding its data loss prevention (DLP) capabilities with the launch of its new AI-powered FortiDLP products. The FortiDLP platform provides automated data movement tracking, cloud application monitoring and endpoint protection mechanisms that work both online and offline.
As data volumes increase exponentially, the threat of data loss from cyberattacks, human error, or system failure has never been greater — which is why, in 2025, fortifying a data protection strategy has never made more sense. Backups continue to serve as the backbone of any data protection solution.
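If backups are the backbone of data protection, a backup is only as good as its ability to restore intact. A minimal sketch of one common safeguard, content-hash verification (the function names and in-memory "backup" structure are illustrative assumptions, not any vendor's implementation):

```python
# Sketch of backup integrity checking: record a SHA-256 digest of the data
# at backup time, then recompute and compare it before trusting a restore.
# The dict-based "backup" stands in for real storage, purely for illustration.
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def make_backup(data: bytes) -> dict:
    return {"payload": data, "sha256": digest(data)}

def verify_backup(backup: dict) -> bool:
    return digest(backup["payload"]) == backup["sha256"]

snapshot = make_backup(b"customer-table-rows")
print(verify_backup(snapshot))          # intact copy verifies
snapshot["payload"] = b"tampered"       # simulate corruption or ransomware
print(verify_backup(snapshot))          # verification now fails
```

The same compare-against-a-recorded-digest idea underlies how real backup tools detect silent corruption and tampering before a restore is attempted.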