Micron Technology announced a $100 million venture fund on Wednesday that will invest in artificial intelligence startups. The company is betting that broader use of the technology will boost sales of its memory chips, both in the data centers where the algorithms are trained and in the systems where they are deployed to spot imperfections in products on a manufacturing line or to understand human speech.
“These trends are at the heart of the biggest opportunities in front of us, and increasingly require memory and storage technologies to turn vast amounts of data into insights,” said Sanjay Mehrotra, the company’s chief executive, in a statement. The neural networks at the heart of machine learning are growing more and more complicated. Handling them means using not only more compute but also more memory and storage.
Micron estimates that machine learning training workloads require six times the DRAM per server and twice the solid-state drive capacity of a standard cloud server. The company expects servers designed for machine learning to account for almost half of all shipments in 2025. Server shipments increased 20.5 percent year over year to 2.9 million units in the second quarter of 2018, according to market researcher IDC.
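Micron's ratios translate into large absolute capacities. A minimal sketch, using an assumed baseline configuration for a standard cloud server (the baseline figures below are illustrative, not from the article):

```python
# Applying Micron's estimates: machine learning training servers need
# roughly 6x the DRAM and 2x the SSD capacity of a standard cloud server.
# Baseline capacities are illustrative assumptions, not article figures.

baseline_dram_gb = 256  # assumed DRAM in a standard cloud server
baseline_ssd_tb = 2     # assumed SSD capacity in a standard cloud server

ml_dram_gb = baseline_dram_gb * 6  # -> 1536 GB of DRAM
ml_ssd_tb = baseline_ssd_tb * 2    # -> 4 TB of SSD
print(ml_dram_gb, ml_ssd_tb)
```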
High bandwidth is the other major requirement. Machine learning models are too large for the memory caches close to the main processor, so the algorithms must instead be stored in separate memory that feeds the processor with data. Performance is constrained by how long it takes to get information from that memory, which depends on the number of available memory channels and how much data can be pushed through each one.
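The channel arithmetic above can be sketched with a back-of-the-envelope model. The channel count, bus width and transfer rate below describe a hypothetical DDR4-3200 server and are assumptions for illustration, not figures from the article:

```python
# Rough peak-bandwidth model: channels x bytes per transfer x transfer rate.
# The figures below (six 64-bit channels at 3.2 billion transfers/s, i.e. a
# hypothetical DDR4-3200 server) are illustrative assumptions.

def peak_bandwidth_gbs(channels, bus_width_bits, transfers_per_sec):
    """Peak memory bandwidth in GB/s across all channels."""
    bytes_per_transfer = bus_width_bits / 8
    return channels * bytes_per_transfer * transfers_per_sec / 1e9

print(peak_bandwidth_gbs(6, 64, 3.2e9))  # -> 153.6 GB/s
```

Adding channels or raising the transfer rate scales the total linearly, which is why both levers matter for feeding data-hungry accelerators.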
High bandwidth memory, more commonly known as HBM, lessens the memory bottleneck. The technology comes in cubes of stacked DRAM with an integrated controller that increases speed while reducing size, latency and power. Micron has struggled to stay competitive with its own hybrid memory cube (HMC) technology, but the company plans to enter the high bandwidth memory market before the end of 2019.
HBM, which is currently manufactured by Samsung Electronics and SK Hynix, is supposed to consume less power while occupying less space than Micron's GDDR memory chips. The high bandwidth DRAM is designed to be paired with the graphics processors that are the current gold standard in machine learning. Micron recently entered volume production with its 8Gb GDDR6 memory, targeting 64 GB per second of bandwidth in a single package.
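The 64 GB/s figure follows from per-pin arithmetic. A minimal sketch, assuming a 32-bit interface running at 16 Gb/s per pin (values consistent with published GDDR6 capabilities, but not stated in the article):

```python
# How a GDDR6 package can reach 64 GB/s: per-pin data rate x pin count.
# The 16 Gb/s per-pin rate and 32-bit (x32) interface are assumptions
# consistent with GDDR6 specifications, not figures from the article.

data_pins = 32             # data pins per package (x32 interface)
gbits_per_pin = 16         # per-pin data rate in Gb/s

package_bandwidth_gbs = data_pins * gbits_per_pin / 8  # bits -> bytes
print(package_bandwidth_gbs)  # -> 64.0 GB/s per package
```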
The industrial and automotive sectors also have high memory requirements. The increasing use of cameras, radar and other sensors in semi-autonomous cars means more information flowing through them, driving demand for memory chips with higher bandwidth and throughput and lower latency. Micron's $3 billion investment in the American factory where it builds long-lifecycle products was aimed at these applications.
The Boise, Idaho-based company is tapping into the Internet of Things to protect its business from sudden fluctuations in the PC and smartphone markets. For much of the last two years, consumer electronics sales have propped up prices and the fortunes of Micron Technology. But Mehrotra has said that the market is “structurally different” than it once was, with more applications than ever needing the company’s chips.
DRAM manufacturers have been operating their factories at almost full capacity and adding new production lines to tap into the prolonged boom. The question is whether they are carelessly boosting production. Capital expenditures targeting DRAM jumped 81 percent to $16.3 billion last year and are projected to climb another 40 percent to $22.9 billion before the end of the year, according to market researcher IC Insights.
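The IC Insights figures quoted above can be sanity-checked in a few lines (the percentages are rounded in the original, so the result lands slightly under $22.9 billion):

```python
# Sanity-checking the IC Insights DRAM capex figures quoted above.
capex_last_year = 16.3      # billions USD, after an 81% jump
projected_growth = 0.40     # projected further climb this year

capex_this_year = capex_last_year * (1 + projected_growth)
print(round(capex_this_year, 1))  # -> 22.8, close to the $22.9B projection

# Backing out the year-earlier base implied by the 81% jump:
prior_base = capex_last_year / 1.81
print(round(prior_base, 1))  # -> 9.0 billion USD
```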
The costs of ramping up each new generation of memory are growing. Bit volume increases that once came with the transition to a new technology node have slowed. Manufacturing DRAM with 20-nanometer or more advanced processes requires 80 percent more production space per wafer, according to Micron Technology. The company expects its 18-nanometer DRAM production to exceed that of previous generations before the end of 2018.
Despite the rising costs, higher capital expenditures have touched off concerns that a flood of new capacity could drive down memory prices. Memory chips, like other commodities such as steel and solar panels, rise and fall in price with supply and demand. Computer memory prices jumped 96 percent over the last year, according to IC Insights, and buyers were paying 35 percent more for DRAM this August than last August.
The cost of storage also continues to decline. NAND prices are projected to drop around 10 to 15 percent in the fourth quarter and decline another 25 to 30 percent next year, according to market tracker DRAMeXchange. One major factor is that higher density devices are entering volume production. Micron Technology, for instance, has 96-layer 3D NAND technology on track for volume shipments in the second half of 2018.
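Because the two DRAMeXchange forecasts compound, the cumulative decline is larger than either figure alone. A quick sketch of the implied range:

```python
# Compounding the DRAMeXchange NAND price forecasts quoted above:
# a 10-15% drop in Q4 followed by a further 25-30% drop next year.

for q4_drop, next_year_drop in [(0.10, 0.25), (0.15, 0.30)]:
    remaining = (1 - q4_drop) * (1 - next_year_drop)
    print(f"cumulative decline: {(1 - remaining) * 100:.1f}%")
# Prices would end up roughly a third to two-fifths below today's levels.
```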