Cloud services and private networks, which for years have been designed to cope with data measured in gigabytes and terabytes, will need to be adapted in the months and years to come to handle petabytes and exabytes. Some companies, including CoreWeave and VAST Data, have already begun developing the next generation of infrastructure.

Data storage has historically been organized in tiers: recent, high-priority data is kept readily accessible, while older data is buried further down, according to VAST chief executive Renen Hallak. That approach is less feasible now, he said. “Once you have a good AI model you want to infer on all of your history, because that’s how you get value out of it. And then as you get more information you want to retrain and build a better model. You read data over and over and over again across many petabytes and in some cases exabytes. And so that’s a very different problem.”

While traditional systems scale by adding nodes that each store a slice of the larger data set, VAST gives every node access to all of the information at once, which it says improves scalability, speed and resiliency. VAST also unbundles the price of data storage from the price of computing, which it says saves customers money.
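To make the contrast concrete, here is a minimal Python sketch of the two designs. It is an illustration under stated assumptions, not VAST's actual implementation: the `TieredStore` and `SharedEverythingStore` classes and their fields are hypothetical, meant only to show why repeated full-history reads strain a tiered layout but not a flat, shared namespace.

```python
from dataclasses import dataclass, field

# Hypothetical sketch, not VAST's code: a tiered store pays a slow-path
# penalty whenever a read misses the hot tier, while a flat
# "shared-everything" store lets any node read any object directly.

@dataclass
class TieredStore:
    hot: dict = field(default_factory=dict)   # recent, high-priority data
    cold: dict = field(default_factory=dict)  # older data, slow to access

    def read(self, key):
        if key in self.hot:
            return self.hot[key]
        value = self.cold[key]   # slow path: pull from the archive tier
        self.hot[key] = value    # promote so the next read is fast
        return value


@dataclass
class SharedEverythingStore:
    objects: dict = field(default_factory=dict)  # one flat namespace

    def read(self, key):
        # Every node sees the whole namespace, so a full-history scan
        # (retraining, or inference over all data) never funnels through
        # a hot tier or a single node's slice of the data set.
        return self.objects[key]


if __name__ == "__main__":
    history = {f"record-{i}": i for i in range(1_000)}
    flat = SharedEverythingStore(objects=dict(history))
    # AI retraining, as Hallak describes it, rereads everything each pass.
    print(sum(flat.read(k) for k in history))
```

In the tiered version, scanning all of history repeatedly forces constant promotion traffic between tiers; in the flat version the scan is just a read, which is the scalability argument the article attributes to VAST.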