For exploring large, complex data sets, nothing matches the power of interactive visualizations that let users directly manipulate data and arrange it in new ways.
But with data sets of millions or billions of data points, interactivity is hard to achieve. It is hindered by the sheer number of computations required to answer a query, and by the latency of accessing data so large it must live on disk. AT&T researchers realized the best way to maintain interactivity was to move all of the data into main memory. How to do that was the challenge, and it required a new type of data structure: the nanocube.
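The key idea behind this style of in-memory structure is to precompute aggregate counts at every level of a spatial hierarchy, so coarse and fine queries alike become constant-time lookups instead of scans. The sketch below is not the nanocube itself, only a minimal toy illustration of that principle under assumed names (`quad_key`, `CountIndex` are hypothetical): points are binned into a quadtree key, and a count is stored for every key prefix and category.

```python
from collections import defaultdict

def quad_key(lat, lon, depth=8):
    """Toy quadtree key: halve the lat/lon range at each level."""
    key = []
    lat_lo, lat_hi, lon_lo, lon_hi = -90.0, 90.0, -180.0, 180.0
    for _ in range(depth):
        lat_mid = (lat_lo + lat_hi) / 2
        lon_mid = (lon_lo + lon_hi) / 2
        # Cell id 0..3 encodes the quadrant at this level.
        key.append((lat >= lat_mid) * 2 + (lon >= lon_mid))
        lat_lo, lat_hi = (lat_mid, lat_hi) if lat >= lat_mid else (lat_lo, lat_mid)
        lon_lo, lon_hi = (lon_mid, lon_hi) if lon >= lon_mid else (lon_lo, lon_mid)
    return tuple(key)

class CountIndex:
    """In-memory aggregation: counts per (quadtree prefix, category)."""
    def __init__(self):
        self.counts = defaultdict(int)

    def add(self, lat, lon, category):
        key = quad_key(lat, lon)
        # Increment a count at every prefix, so any zoom level
        # is answered by a single dictionary lookup.
        for d in range(len(key) + 1):
            self.counts[(key[:d], category)] += 1

    def query(self, prefix, category):
        return self.counts[(tuple(prefix), category)]
```

For example, after adding two "tweet" points, `query((), "tweet")` returns the global count 2, while a one-level prefix returns the count for one quadrant of the map. The real nanocube is far more sophisticated (it shares subtrees across dimensions to keep memory manageable), but the query-by-prefix idea is the same.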
The Best Open Source Big Data Tools: RCloud, from AT&T Labs, was created to address the need for a collaborative data analysis environment for R. RCloud allows researchers to analyze large data sets and share their results across an organization. For example, one group of data scientists might use RCloud to document, in a notebook, the data workflow for semantic analysis of Web documents; the company's machine learning group could then annotate and reuse that notebook.
AT&T Labs and Cask announced a developer preview of the open source Tigon project, a real-time streaming analytics framework for Hadoop based on technology contributed by both companies. Tigon offers a simple, efficient and cost-effective way for developers to create a diverse range of apps that address a broad set of use cases such as network intrusion detection and analytics, social media market analysis, location analytics, and real-time recommendation engines.
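Tigon's actual programming interfaces are not shown here, so the following is only a language-neutral sketch of the kind of per-key windowed aggregation that use cases like network intrusion detection require: counting recent events per source (all names below are hypothetical), so that a source exceeding a threshold within the window can be flagged in real time.

```python
from collections import deque, defaultdict

class SlidingWindowCounter:
    """Toy stream aggregator: count events per key over the last
    `window` seconds, evicting expired events as new ones arrive."""
    def __init__(self, window):
        self.window = window
        self.events = deque()           # (timestamp, key), arrival order
        self.counts = defaultdict(int)  # current in-window count per key

    def observe(self, ts, key):
        self.events.append((ts, key))
        self.counts[key] += 1
        self._evict(ts)

    def _evict(self, now):
        # Drop events older than the window from the front of the queue.
        while self.events and self.events[0][0] <= now - self.window:
            _, old_key = self.events.popleft()
            self.counts[old_key] -= 1

    def count(self, key):
        return self.counts[key]
```

An intrusion detector built on this idea would call `observe` for each connection attempt and alert when `count(src_ip)` crosses a threshold; a production framework like Tigon adds the parts this sketch omits, such as distribution across a Hadoop cluster and fault tolerance.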
The AT&T Future Star Sigcomm Travel Award encourages and supports students who have a serious interest in computer networking and telecommunications, and whose research addresses real-world problems and challenges, to attend the Sigcomm conference.
Parallel hybrid heuristics for the permutation flow shop problem
Mauricio Resende; Martin G. Ravetti, Fed. U. of Minas Gerais; Carlos Riveros, U. of Newcastle; Alexandre Mendes, U. of Newcastle; Panos M. Pardalos