Big Data is a popular buzzword in computer science and with good reason. The analysis of large data sets is both a difficult problem and one with a wide range of applications; the selection of ads for blogs based on their content and the user's activity is just one example.
But traditional Big Data platforms, such as those offered by Google Cloud, are designed for conventional database workloads; they aren't built to handle time-oriented data. Cyber-physical systems, by contrast, treat time as a fundamental concept, and the data they produce is known as time-series data, or signals.
Of course, entire fields, such as signal processing, have sprung up to develop the mathematics of analyzing signals. But the design of large computing systems that can efficiently handle time-series data has lagged behind.
That's where Big Signals comes in. We need cloud computing systems that are designed to manage signals and time-series data. Signals are processed differently than, say, sales transactions: cloud systems that operate on signal-oriented data will want to process small windows of a signal in real time to identify important events, and they will also want to analyze historical data at multiple time scales to identify larger trends.
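As a minimal sketch of the real-time side of this idea, the following hypothetical spike detector maintains a small sliding window over an incoming signal and flags samples that jump well above the recent average. The function name, window size, and threshold are illustrative assumptions, not part of any existing system:

```python
from collections import deque

def detect_events(samples, window=5, threshold=2.0):
    """Flag indices where a sample exceeds `threshold` times the
    mean of the preceding `window` samples (a simple spike detector).
    Illustrative sketch only; real systems would use streaming input."""
    buf = deque(maxlen=window)  # holds the most recent `window` samples
    events = []
    for i, x in enumerate(samples):
        if len(buf) == window:
            mean = sum(buf) / window
            if mean > 0 and x > threshold * mean:
                events.append(i)
        buf.append(x)
    return events

# A steady signal with one spike at index 5:
signal = [1.0, 1.1, 0.9, 1.0, 1.05, 5.0, 1.0, 1.0]
print(detect_events(signal))  # → [5]
```

The key design point is that only a small, fixed-size window of the signal needs to be in memory at any moment, which is what makes this style of processing feasible at streaming rates.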
Here are a few examples of how Big Signals might be used in the cloud. Farmers may use historical weather data to decide how to plant, water, and feed their crops. Medical teams may use cloud-based systems both to monitor a patient's current state and to run longer-term analyses for diagnosis. Energy systems may use historical load data to manage energy generation; they can also use historical weather data to predict the generation capacity available from wind and solar.
Existing cloud computing systems are a good start, but we need to understand how data schemas, access scheduling, and other design problems must change to meet the challenges of Big Signals.