AI Server Cluster: The use of two or more AI servers working together to increase performance, capacity, or reliability.
In the News: A Checklist for building AI Workstations.
Related Terms: AI, AI technology, cognitive technology, artificial intelligence and machine learning
AI Workstation: A specialized desktop computer designed for technical or scientific applications. Intended primarily for use by one person at a time, AI workstations are commonly connected to a local area network and run multi-user operating systems.
Basic Input/Output System (BIOS): firmware used to perform hardware initialization during the booting process, and to provide runtime services for operating systems and programs.
Big Data: A field that treats ways to analyze, systematically extract information from, or otherwise deal with data sets that are too large or complex to be dealt with by traditional data-processing application software. Learn More.
Related Terms: big data cluster, big data server
Database Server: a server which uses a database application that provides database services to other computer programs or to computers, as defined by the client–server model. Learn More.
Data Visualization: an interdisciplinary field that deals with the graphic representation of data. It is a particularly efficient way of communicating when the data set is large, as with a time series, for example.
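Even without a graphics library, the idea behind data visualization can be sketched in a few lines: mapping values to visual lengths makes comparisons immediate. The sketch below renders a small (purely illustrative) data set as a text bar chart, scaling each bar against the largest value.

```python
# Minimal text-based visualization sketch; the labels and values
# here are illustrative, not real data.
data = {"Mon": 12, "Tue": 30, "Wed": 21, "Thu": 9}
width = 20  # width, in characters, of the longest bar
peak = max(data.values())

# Scale each value to a bar length relative to the peak value.
lines = [
    f"{label:>3} | {'#' * round(value / peak * width)} {value}"
    for label, value in data.items()
]
print("\n".join(lines))
```

The same scaling step (value divided by peak, times a fixed width) is what a plotting library does when it maps data coordinates to pixels.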
Data Lake: a system or repository of data stored in its natural/raw format, usually object blobs or files.
Deep Learning: Part of a broader family of machine learning methods based on artificial neural networks with representation learning. Learning can be supervised, semi-supervised or unsupervised.
In the News: Tips to Building an AI GPU Cluster
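The "artificial neural networks" in the definition above can be illustrated with a tiny forward pass: inputs are weighted, summed, and passed through a nonlinearity at each layer. The weights below are arbitrary placeholders; a real network would learn them from data.

```python
import math

def sigmoid(x):
    # Squashes any real number into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, hidden_weights, output_weights):
    # One hidden layer: each hidden unit is a weighted sum of the
    # inputs passed through the sigmoid nonlinearity.
    hidden = [
        sigmoid(sum(w * x for w, x in zip(ws, inputs)))
        for ws in hidden_weights
    ]
    # Output unit: weighted sum of hidden activations, squashed again.
    return sigmoid(sum(w * h for w, h in zip(output_weights, hidden)))

# Illustrative (untrained) weights for a 2-input, 2-hidden-unit net.
hidden_weights = [[0.5, -0.2], [0.3, 0.8]]
output_weights = [1.0, -1.0]
y = forward([1.0, 0.5], hidden_weights, output_weights)
print(y)
```

Deep learning stacks many such layers, and training adjusts the weights so the output matches labeled examples (supervised), partial labels (semi-supervised), or structure in the data itself (unsupervised).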
Edge Computing: a distributed computing paradigm that brings computation and data storage closer to the location where it is needed to improve response times and save bandwidth.
Enterprise Data Lake: a collaborative self-service big data discovery and preparation solution for data analysts and data scientists. It enables analysts to rapidly discover and turn raw data into insight and allows IT to ensure quality, visibility, and governance.
HDFS: a distributed file system that handles large data sets running on commodity hardware. It is used to scale a single Apache Hadoop cluster to hundreds (and even thousands) of nodes. HDFS is one of the major components of Apache Hadoop, the others being MapReduce and YARN.
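The MapReduce component mentioned above can be sketched in miniature: a map step turns each shard of data into intermediate results, and a reduce step merges them into one answer. This toy word count runs on in-memory strings rather than an actual Hadoop cluster, but the two-phase structure is the same.

```python
from collections import Counter
from functools import reduce

def map_shard(text):
    # Map step: each "node" counts the words in its own shard.
    return Counter(text.split())

def reduce_counts(acc, shard_counts):
    # Reduce step: merge per-shard counts into a global tally.
    acc.update(shard_counts)
    return acc

# Two shards standing in for blocks spread across HDFS data nodes.
shards = ["big data big clusters", "data lakes hold big data"]
total = reduce(reduce_counts, map(map_shard, shards), Counter())
print(total["big"], total["data"])  # prints: 3 3
```

In Hadoop proper, the map tasks run on the nodes that hold the data blocks, so computation moves to the data instead of the other way around.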
HPC or High Performance Computing: The practice of aggregating computing power in a way that delivers much higher performance than one could get out of a typical desktop computer or workstation in order to solve large problems in science, engineering, or business.
In the News: Emerging Alternatives for HPC. | Learn More.
Related Terms: high performance computing, HPC server, HPC cluster, HPC storage, HPC applications, high performance computing cluster, computing cluster
Hyper-Threading: Intel’s proprietary simultaneous multithreading (SMT) implementation, used to improve parallelization of computations performed on x86 microprocessors.
Internet of Things (IoT): describes the network of physical objects—a.k.a. “things”—that are embedded with sensors, software, and other technologies for the purpose of connecting and exchanging data with other devices and systems over the Internet.
IPMI: A set of computer interface specifications for an autonomous computer subsystem that provides management and monitoring capabilities independently of the host system’s CPU, firmware (BIOS or UEFI) and operating system.
Kubernetes: An open-source container-orchestration system for automating computer application deployment, scaling, and management.
In the News: On-Premise Kubernetes Brings Cloud Computing Benefits to the Data Center.
Liquid Cooling: The reduction of heat in electronic and mechanical devices through exploiting the properties of liquids.
Machine Learning: The study of computer algorithms that improve automatically through experience and by the use of data. It is seen as a part of artificial intelligence.
In the News: Scaling Machine Learning Systems | Learn More.
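"Improving automatically through experience" can be made concrete with the simplest possible learner: fitting a line to data by gradient descent. Each pass over the (made-up) data nudges the parameters to reduce the error, so the model literally improves with experience.

```python
# Fit y = w*x + b to a small synthetic data set (roughly y = 2x)
# by gradient descent, using only the standard library.
points = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 8.0)]
w, b, lr = 0.0, 0.0, 0.01  # initial guess and learning rate

for _ in range(5000):
    # Gradient of mean squared error with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in points) / len(points)
    grad_b = sum(2 * (w * x + b - y) for x, y in points) / len(points)
    # Step downhill: each update reduces the error a little.
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))
```

Deep learning (above) applies this same update rule, at vastly larger scale, to millions or billions of parameters instead of two.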
Metadata: a set of data that describes and gives information about other data.
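Filesystems are a familiar source of metadata: every file carries information about itself (size, timestamps, permissions) alongside its contents. The sketch below creates a throwaway file and reads that metadata with Python's built-in `os.stat`.

```python
import os
import time

# Create a small file so there is something to inspect.
path = "example.txt"  # illustrative filename
with open(path, "w") as f:
    f.write("hello")

# os.stat returns the file's metadata, not its contents.
info = os.stat(path)
print("size (bytes):", info.st_size)
print("last modified:", time.ctime(info.st_mtime))

os.remove(path)  # clean up the temporary file
```

The five bytes of "hello" are the data; the size and modification time describing them are the metadata.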
Multi-Core Processor: a computer processor on a single integrated circuit with two or more separate processing units, called cores, each of which reads and executes program instructions.
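Software takes advantage of multiple cores by splitting work across concurrent workers. A minimal sketch using Python's standard `concurrent.futures` pool:

```python
from concurrent.futures import ThreadPoolExecutor

def work(n):
    # Stand-in for a unit of work handed to a worker.
    return n * n

# Distribute eight tasks across four workers. For CPU-bound Python
# code, ProcessPoolExecutor (one OS process per core) is the usual
# choice because the interpreter's GIL limits thread parallelism;
# the thread pool here keeps the example self-contained.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(work, range(8)))

print(results)  # prints: [0, 1, 4, 9, 16, 25, 36, 49]
```

Whether the workers are threads or processes, the pattern is the same: divide the work, run pieces concurrently, and gather the results.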
NoSQL Database: Provides a mechanism for storage and retrieval of data that is modeled in means other than the tabular relations used in relational databases.
Open Source Software: a type of computer software in which source code is released under a license in which the copyright holder grants users the rights to use, study, change, and distribute the software to anyone and for any purpose.
Parallel File Systems: a type of distributed file system that distributes file data across multiple servers and provides for concurrent access by multiple tasks of a parallel application.
Predictive Analytics: the use of data, statistical algorithms and machine learning techniques to identify the likelihood of future outcomes based on historical data.
Scale Out Storage: a network-attached storage (NAS) architecture in which the total amount of disk space can be expanded through the addition of devices in connected arrays with their own resources. In a scale-out system, new hardware can be added and configured as the need arises.
SQL Database: a relational database managed with SQL (Structured Query Language), a domain-specific language used in programming and designed for managing data held in a relational database management system, or for stream processing in a relational data stream management system.
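The tabular, declarative style that defines SQL can be shown with Python's built-in `sqlite3` module, which embeds a small relational database. The table and column names below are purely illustrative.

```python
import sqlite3

# An in-memory relational database: data lives in tables with typed
# columns, and queries describe *what* to fetch, not how.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE servers (name TEXT, cores INTEGER)")
conn.executemany(
    "INSERT INTO servers VALUES (?, ?)",
    [("node-a", 32), ("node-b", 64)],  # illustrative rows
)

# Declarative query: the database engine decides how to find rows.
rows = conn.execute(
    "SELECT name FROM servers WHERE cores > 40"
).fetchall()
print(rows)  # prints: [('node-b',)]

conn.close()
```

Contrast this with the NoSQL entry above: there, data is stored and retrieved by key or document shape rather than through tabular relations and a query planner.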
Storage Cluster: The use of two or more storage servers working together to increase performance, capacity, or reliability.
In the News: Building a Storage Cluster for University Research & More | Learn More.
Storage Server: A server used to store an organization's data and applications, serving as a repository for both. Learn More.
Tiered Storage: a system or method for assigning data to various types of storage media based on a range of requirements for cost, availability, performance, and recovery.
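A tiering policy often keys on how recently data was accessed: hot data stays on fast media, cold data moves to cheap media. The thresholds and tier names below are illustrative assumptions, not a standard.

```python
import time

def assign_tier(last_access, now, hot_days=7, warm_days=90):
    # Illustrative policy: recently touched data earns faster media.
    age_days = (now - last_access) / 86400  # seconds per day
    if age_days <= hot_days:
        return "hot (NVMe)"
    if age_days <= warm_days:
        return "warm (HDD)"
    return "cold (archive)"

now = time.time()
print(assign_tier(now - 2 * 86400, now))    # accessed 2 days ago
print(assign_tier(now - 30 * 86400, now))   # accessed 30 days ago
print(assign_tier(now - 365 * 86400, now))  # accessed a year ago
```

Real tiering systems weigh cost, availability, performance, and recovery requirements together, but access recency is the most common trigger for moving data between tiers.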
Virtualization: the act of creating a virtual version of something, including virtual computer hardware platforms, storage devices, and computer network resources.
INDUSTRY-SPECIFIC TECHNICAL GLOSSARY
Enterprise Search Appliance: Software targeted to mid- to large-sized businesses that need a fast, inexpensive search solution to index websites, file systems, or archives. Learn More.
High Throughput Genomics: The automation of genomic experiments or tests allowing for large-scale repetition. Learn More.
Related terms: DNA sequencing
Range Safety Analysis: In the field of rocketry, range safety may be assured by a system which is intended to protect people and assets on both the rocket range and downrange in cases when a launch vehicle might endanger them. Learn More.
Streaming Analytics: The continuous processing and analysis of data as it arrives, rather than in batches. Learn More.
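The contrast with batch processing is easy to sketch: instead of collecting all the data and computing once, a streaming computation emits an updated result after every element. A running mean over a stream of sensor readings (the values are made up):

```python
def running_mean(stream):
    """Yield the mean of all values seen so far, one per element."""
    total, count = 0.0, 0
    for value in stream:
        total += value
        count += 1
        # Emit a result immediately, without waiting for the stream
        # to end -- the defining trait of streaming analytics.
        yield total / count

readings = [10, 20, 30, 40]  # illustrative sensor values
print(list(running_mean(readings)))  # prints: [10.0, 15.0, 20.0, 25.0]
```

A batch job would report only the final 25.0; the streaming version keeps an answer available at every point, using constant memory regardless of stream length.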
Weather Modeling: the use of systems of differential equations based on the laws of physics (specifically fluid motion, thermodynamics, radiative transfer, and chemistry), applied over a coordinate system that divides the planet into a 3D grid.
In The News: How Technology Can Reduce Wildfire Risks | Learn More.