Data architecture definition: Data architecture describes the structure of an organization's logical and physical data assets and data management resources, according to The Open Group Architecture Framework (TOGAF). An organization's data architecture is the purview of data architects.
The chipmaker has released a series of what it calls Enterprise Reference Architectures (Enterprise RA), which are blueprints to simplify the building of AI-oriented data centers. Building an AI-oriented data center is no easy task, even by data center construction standards.
From customer service chatbots to marketing teams analyzing call center data, the majority of enterprises — about 90%, according to recent data — have begun exploring AI. For companies investing in data science, realizing the return on these investments requires embedding AI deeply into business processes.
In a global economy where innovators increasingly win big, too many enterprises are stymied by legacy application systems. Modernising with GenAI: Modernising the application stack is therefore critical and, increasingly, businesses see GenAI as the key to success. The solution, GenAI, is also the beneficiary.
Every data-driven project calls for a review of your data architecture — and that includes embedded analytics. Before you add new dashboards and reports to your application, you need to evaluate your data architecture with analytics in mind. Here are nine questions to ask yourself when planning your ideal architecture.
Microsoft is describing AI agents as the new applications for an AI-powered world. In our real-world case study, we needed a system that would create test data. This data would be utilized for different types of application testing. There can be up to eight different data sets or files.
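A system like the one described — generating up to eight data sets or files for application testing — can be sketched as a small synthetic-data generator. The function name, CSV layout, and column names below are illustrative assumptions, not details from the case study:

```python
import csv
import random
from pathlib import Path

# Hypothetical: the case study mentions "up to eight different data sets
# or files"; everything else here is an assumed layout for illustration.
DATASETS = [f"dataset_{i}" for i in range(1, 9)]

def generate_test_data(out_dir: str, rows: int = 100, seed: int = 42) -> list[Path]:
    """Write one CSV of synthetic rows per data set and return the file paths."""
    random.seed(seed)  # fixed seed so test runs are reproducible
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    paths = []
    for name in DATASETS:
        path = out / f"{name}.csv"
        with path.open("w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["id", "value"])
            for i in range(rows):
                writer.writerow([i, random.randint(0, 1000)])
        paths.append(path)
    return paths
```

Seeding the random generator is the key design choice here: it makes each test run deterministic, so failures can be reproduced against identical input files.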
New research from IBM finds that enterprises are further along in deploying AI applications on the big iron than might be expected: 78% of IT executives surveyed said their organizations are either piloting projects or operationalizing initiatives that incorporate AI technology.
Broadcom on Tuesday released VMware Tanzu Data Services, a new “advanced service” for VMware Cloud Foundation (VCF), at VMware Explore Barcelona. VMware Tanzu for MySQL: “The classic web application backend that optimizes transactional data handling for cloud native environments.”
After all, a low-risk annoyance in a key application can become a sizable boulder when the app requires modernization to support a digital transformation initiative. Accenture reports that the top three sources of technical debt are enterprise applications, AI, and enterprise architecture.
Enterprise architecture (EA) has evolved beyond governance and documentation. Establish clear roles and responsibilities for an integrated team of business, application, data and technology architects. Ensure architecture insights drive business strategy. Accelerate transformation by enabling rapid decision-making.
As data centers evolve from traditional compute and storage facilities into AI powerhouses, the demand for qualified professionals continues to grow exponentially and salaries are high. The rise of AI, in particular, is dramatically reshaping the technology industry, and data centers are at the epicenter of the changes.
Cisco is boosting network density support for its data center switch and router portfolio as it works to deliver the network infrastructure its customers need for cloud architecture, AI workloads and high-performance computing. Cisco’s Nexus 9000 data center switches are a core component of the vendor’s enterprise AI offerings.
HPE claims that this approach effectively reduces the required data center floor space by 50% and reduces the cooling power necessary per server blade by 37%. Data centers warm up to liquid cooling: AI, machine learning, and high-performance computing are creating cooling challenges for data center owners and operators.
Speaker: Ahmad Jubran, Cloud Product Innovation Consultant
Many do this by simply replicating their current architectures in the cloud. Those previous architectures, which were optimized for transactional systems, aren't well-suited for the new age of AI. In this webinar, you will learn how to: Take advantage of serverless application architecture. And much more!
All industries and modern applications are undergoing rapid transformation powered by advances in accelerated computing, deep learning, and artificial intelligence. The next phase of this transformation requires an intelligent data infrastructure that can bring AI closer to enterprise data. Imagine that you’re a data engineer.
For starters, generative AI capabilities will improve how enterprise IT teams deploy and manage their SD-WAN architecture. With AI-driven network management and optimization capabilities, enterprises will be able to prioritize traffic and application performance based on user needs and business requirements, according to IDC.
Fortinet is expanding its data loss prevention (DLP) capabilities with the launch of its new AI-powered FortiDLP products. The FortiDLP platform provides automated data movement tracking, cloud application monitoring and endpoint protection mechanisms that work both online and offline.
Zero Trust architecture was created to solve the limitations of legacy security architectures. It’s the opposite of a firewall and VPN architecture, where once on the corporate network everyone and everything is trusted. In today’s digital age, cybersecurity is no longer an option but a necessity.
Last year, Nvidia's GTC 2024 grabbed headlines with its introduction of the Blackwell architecture and the DGX systems powered by it. Expect GTC 2025 to further solidify Nvidia's position as an AI leader as it showcases practical applications of generative AI, moving beyond theoretical concepts to real-world implementations.
Migration to the cloud, data valorization, and development of e-commerce are areas where rubber sole manufacturer Vibram has transformed its business as it opens up to new markets. "Data is the heart of our business, and its centralization has been fundamental for the group," says Emmelibri CIO Luca Paleari.
Cisco has unwrapped a new family of data center switches it says will help customers more securely support large workloads and facilitate AI development across the enterprise. The first major service these DPUs will perform on the switch will be Layer 4 stateful segmentation through Cisco's Hypershield security architecture.
Supermicro announced the launch of a new storage system optimized for AI workloads using multiple Nvidia BlueField-3 data processing units (DPU) combined with an all-flash array. The Supermicro JBOF replaces the traditional storage CPU and memory subsystem with the BlueField-3 DPU and runs the storage application on the DPU’s 16 Arm cores.
Legacy platforms, meaning IT applications and platforms that businesses implemented decades ago and which still power production workloads, are what you might call the third rail of IT estates. The first is migrating data and workloads off of legacy platforms entirely and rehosting them in new environments, like the public cloud.
VMware by Broadcom has unveiled a new networking architecture that it says will improve the performance and security of distributed artificial intelligence (AI) — using AI and machine learning (ML) to do so. "Each stage of edge technology evolution is capable of transforming a variety of industries," the report noted.
Lightmatter has announced new silicon photonics products that could dramatically speed up AI systems by solving a critical problem: the sluggish connections between AI chips in data centers. Today's AI chips often sit idle waiting for data to arrive, wasting computing resources and slowing down results.
Later, as an enterprise architect in consumer-packaged goods, I could no longer realistically contemplate a world where IT could execute mass application portfolio migrations from data centers to cloud and SaaS-based applications and survive the cost, risk and time-to-market implications.
To keep up, IT must be able to rapidly design and deliver application architectures that not only meet the business needs of the company but also meet data recovery and compliance mandates. It's a tall order, because as technologies, business needs, and applications change, so must the environments where they are deployed.
The road ahead for IT leaders in turning the promise of generative AI into business value remains steep and daunting, but the key components of the gen AI roadmap — data, platform, and skills — are evolving and becoming better defined. But that's only structured data, she emphasized at an MIT event moderated by Lan Guan, CAIO at Accenture.
Kyndryl and Google Cloud are expanding their partnership to help customers use generative AI to move data off the mainframe and into the cloud. The package supports features such as COBOL-to-Java application coding assistance, and it enables AI training using customer on-premise data, according to Kyndryl.
In that report, we analyzed how the cloud-native ecosystem, driven by open-source software (OSS), has been powering architecture modernization across infrastructure and application development, enabling platform-driven innovation in the meantime across a spectrum of technology domains such as data […]
Two things play an essential role in a firm's ability to adapt successfully: its data and its applications. Which is why modernising applications is so important, especially for traditional businesses: they need to keep pace with the challenges facing trade and commerce today.
The MI325X uses AMD’s CDNA 3 architecture, which the MI300X also uses. CDNA 3 is based on the gaming graphics card RDNA architecture but is expressly designed for use in data center applications like generative AI and high-performance computing. The FP6 is new and unique to AMD.
Nvidia has partnered with hardware infrastructure vendor Vertiv to provide liquid cooling designs for future data centers designed to be AI factories. AI factories are specialized data centers emphasizing AI applications as opposed to traditional line-of-business applications like databases and ERP.
Attackers are using encrypted channels to bypass traditional defenses, concealing malware, phishing campaigns, cryptomining/cryptojacking, and data theft within encrypted traffic. Zscaler eliminates this risk and the attack surface by keeping applications and services invisible to the internet. The report offers examples of each.
As 2025 kicked off, I wrote a column about the network vendor landscape — specifically, which networking players will step up and put their efforts into finding new applications with solid business benefits that could enable a network transformation. It's not an application, but an application architecture or model.
As someone deeply involved in shaping data strategy, governance and analytics for organizations, I'm constantly working on everything from defining data vision to building high-performing data teams. My work centers around enabling businesses to leverage data for better decision-making and driving impactful change.
5 key findings: AI usage and threat trends The ThreatLabz research team analyzed activity from over 800 known AI/ML applications between February and December 2024. The surge was fueled by ChatGPT, Microsoft Copilot, Grammarly, and other generative AI tools, which accounted for the majority of AI-related traffic from known applications.
The built-in elasticity in serverless computing architecture makes it particularly appealing for unpredictable workloads, and it amplifies developer productivity by letting developers focus on writing code and optimizing application design, per industry benchmarks, providing additional justification for this hypothesis. Vendor lock-in.
It also supports HPE's Data Fabric architecture, which aims to supply a unified and consistent data layer that allows data access across on-premises data centers, public clouds, and edge environments, with the idea of bringing together a single, logical view of data, regardless of where it resides, according to HPE.
The growing demand for real-time data to power AI applications is compelling businesses to reevaluate their traditional data architectures. Legacy systems typically rely on separate platforms for transactional and analytical processing, leading to inefficiencies and delayed insights.
Why SD-WAN is still critical to the enterprise SD-WAN connects users, applications, and data across locations within a hybrid environment. To accommodate increased traffic, more edges, and dispersed users, modern SD-WAN solutions need to have top-of-the-line performance, resiliency, scalability, and autonomous architecture options.
They are using the considerable power of this fast-evolving technology to tackle the common challenges of cloud modernization, particularly in projects that involve the migration and modernization of legacy applications, a key enabler of digital and business transformation. In this context, GenAI can be used to speed up release times.
Considerable amounts of data are collected at the edge. Edge servers do the job of culling the useless data and sending only the necessary data back to data centers for processing. Liquid cooling gains ground: Liquid cooling is inching its way from the fringes into the mainstream of data center infrastructure.
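The edge-side culling described above can be sketched as a simple change-based filter: forward a reading only when it differs meaningfully from the previous one. The threshold and numeric sensor readings are illustrative assumptions, not details from the article:

```python
def cull_readings(readings: list[float], threshold: float = 0.5) -> list[float]:
    """Keep only readings that changed by more than `threshold` since the
    last observed value, so only the 'necessary' data is sent back to the
    data center. (Hypothetical sketch of edge-side culling.)"""
    kept = []
    prev = None
    for r in readings:
        if prev is None or abs(r - prev) > threshold:
            kept.append(r)  # meaningful change: forward this reading
        prev = r  # track every reading, forwarded or not
    return kept

# Near-duplicate sensor values are dropped; only significant changes survive.
print(cull_readings([1.0, 1.1, 2.0, 2.05, 3.5]))  # → [1.0, 2.0, 3.5]
```

Note that `prev` is updated on every reading, not just the forwarded ones, so a slow drift that never exceeds the threshold between consecutive samples is suppressed entirely — a deliberate trade-off that favors bandwidth savings over fidelity.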