When big data gets too big, this machine-learning algorithm may be the answer
Big data may hold a world of untapped potential, but what happens when your data set is bigger than your processing power can handle? A new algorithm that taps quantum computing may be able to help.

That’s according to researchers from MIT, the University of Waterloo, and the University of Southern California, who published a paper Monday describing a new approach to handling massively complex problems.

Topology focuses on the properties of a shape that stay the same even when it is bent or stretched, and it’s particularly useful for analyzing the connections in complex networks such as the U.S. power grid or the global interconnections of the Internet. It can also help zero in on the most important features of a massive data set.
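To make the idea concrete, here is a minimal classical sketch (not the researchers' quantum algorithm) of the kind of invariant topology provides: the Betti numbers of a network, which count connected components and independent loops and stay the same no matter how the network is drawn, bent, or stretched. The function name and edge-list representation are illustrative assumptions.

```python
def betti_numbers(edges):
    """Return (b0, b1) for an undirected graph given as an edge list:
    b0 = number of connected components, b1 = number of independent cycles.
    These are topological invariants: relabeling or redrawing the graph
    does not change them."""
    nodes = {v for e in edges for v in e}
    parent = {v: v for v in nodes}

    def find(v):
        # Union-find with path halving to track connected components.
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv

    b0 = len({find(v) for v in nodes})    # connected components
    b1 = len(edges) - len(nodes) + b0     # cycle rank of a graph
    return b0, b1

# A square with one diagonal: one component, two independent loops.
print(betti_numbers([(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]))  # (1, 2)
```

A classical computation like this is cheap for small graphs, but the number of potential higher-dimensional features grows explosively with data size, which is the bottleneck the quantum approach is aimed at.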

Read the source article at InfoWorld