Definitions
Data
An aggregator is
“a software implementation based on mathematical function(s) that transforms groups of raw data into intermediate, aggregated data. Raw data can come from any source. Aggregators help in managing 'big' data.”[1]
Internet
An aggregator is client software or a Web application which aggregates syndicated web content such as news headlines, blogs, podcasts, and vlogs in a single location for easy viewing.
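For illustration only, the sketch below shows one way such a client could collect headlines from several syndicated feeds into a single list. It assumes plain RSS 2.0 feeds; the URLs are placeholders, and real-world aggregators typically rely on a dedicated feed-parsing library rather than raw XML handling.

```python
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder feed URLs -- substitute real syndicated feeds.
FEED_URLS = [
    "https://example.com/news/rss.xml",
    "https://example.org/blog/feed.xml",
]

def fetch_titles(url: str) -> list[str]:
    """Download one RSS 2.0 feed and return its item titles."""
    with urllib.request.urlopen(url, timeout=10) as response:
        tree = ET.parse(response)
    # RSS 2.0 nests items under <rss><channel><item><title>.
    return [t.text or "" for t in tree.getroot().iterfind("./channel/item/title")]

def aggregate(urls: list[str]) -> list[str]:
    """Collect headlines from every feed into one list for easy viewing."""
    headlines: list[str] = []
    for url in urls:
        try:
            headlines.extend(fetch_titles(url))
        except (urllib.error.URLError, ET.ParseError):
            continue  # skip feeds that are unreachable or malformed
    return headlines

if __name__ == "__main__":
    for title in aggregate(FEED_URLS):
        print(title)
```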
Overview (Data)
Basic properties, assumptions, recommendations, and general statements about aggregators include:
- 1. Aggregators may be virtual, owing to the benefits of being able to change implementations quickly and of increased malleability. A situation may exist where aggregators are physically manufactured, e.g., a field-programmable gate array (FPGA) or a hard-coded aggregator that is not programmable. Aggregators may also act in a manner similar to n-version voters.
- 2. Aggregators require computing horsepower; however, this assumption can be relaxed by changing the assumption of a virtual implementation to a physical one, e.g., firmware, a microcontroller, or a microprocessor. For example, aggregators could execute on faster hardware such as a smartphone. Aggregators will likely use weights . . . to compute intermediate, aggregated data.
- 3. Aggregators have two actors for consolidating large volumes of data into lesser amounts: Clusters . . . and Weights . . . (a weighted-average sketch of this pattern follows this list). Aggregators address big-data concerns within networks of things (NoTs), and to satisfy this role, computational "performance enhancing" technologies will be needed. This is the only primitive with actors.
- 4. Sensors may communicate directly with other sensors, and thus, in some situations, act much like aggregators.
- 5. Intermediate, aggregated data may suffer from some level of information loss. Proper care in the aggregation process should be given to significant digits, rounding, averaging, and other arithmetic operations to avoid unnecessary loss of precision (a rounding sketch follows this list).
- 6. For each cluster . . . there should be an aggregator or set of potential aggregators.
- 7. Aggregators are either (1) executed at a specific time and for a fixed time interval, or (2) event-driven (both styles are sketched after this list).
- 8. Aggregators may be acquired off-the-shelf; where no suitable aggregator exists, one will need to be home-grown. This may create a problem for handling huge volumes of data within a NoT.
- 9. Some NoTs may not have an aggregator, e.g., a single light sensor will send a signal directly to a smart lightbulb to turn it off or on.
- 10. Security is a concern for aggregators (malware or general defects) and for the sensitivity of their aggregated data. Further, aggregators could be attacked, e.g., by denying them the ability to operate/execute or by feeding them bogus data.
- 11. Reliability is a concern for aggregators (general defects).
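Items 2 and 3 describe aggregation in terms of clusters of sensors and per-sensor weights. The sketch below is a minimal illustration of that pattern, a weighted average per cluster; the sensor ids, cluster membership, weights, and readings are made-up assumptions for illustration, not values drawn from NIST SP 800-183.

```python
# Raw readings keyed by sensor id (hypothetical values).
RAW_DATA = {
    "temp-1": 21.4,
    "temp-2": 22.1,
    "temp-3": 20.9,
    "hum-1": 0.43,
    "hum-2": 0.47,
}

# Clusters: which sensors each aggregator is responsible for.
CLUSTERS = {
    "temperature": ["temp-1", "temp-2", "temp-3"],
    "humidity": ["hum-1", "hum-2"],
}

# Weights: relative trust in each sensor (e.g., by calibration age).
WEIGHTS = {
    "temp-1": 0.5, "temp-2": 0.3, "temp-3": 0.2,
    "hum-1": 0.6, "hum-2": 0.4,
}

def aggregate_cluster(sensor_ids: list[str]) -> float:
    """Weighted average of one cluster's raw readings -> intermediate, aggregated data."""
    total_weight = sum(WEIGHTS[s] for s in sensor_ids)
    return sum(WEIGHTS[s] * RAW_DATA[s] for s in sensor_ids) / total_weight

if __name__ == "__main__":
    for cluster_name, sensors in CLUSTERS.items():
        print(cluster_name, round(aggregate_cluster(sensors), 3))
```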
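Item 5's caution about rounding and significant digits can be seen in a few lines: rounding raw readings before aggregating discards information that rounding once, after aggregation, would preserve. The readings below are made-up values chosen only to make the effect visible.

```python
readings = [10.3, 10.4, 10.4, 10.3, 10.4]

# Careless path: round each raw reading to a whole number before aggregating.
mean_of_rounded = sum(round(r) for r in readings) / len(readings)

# Careful path: aggregate at full precision; round only when reporting.
full_precision_mean = sum(readings) / len(readings)

print(mean_of_rounded)        # 10.0   -> the 0.3-0.4 signal is gone
print(full_precision_mean)    # ~10.36 -> preserved for downstream consumers
```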
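Item 7 distinguishes aggregators that run on a fixed schedule from event-driven ones. The sketch below applies the same placeholder aggregation function in both styles; the interval length, buffers, and trigger threshold are illustrative assumptions.

```python
import time

def aggregate(buffer: list[float]) -> float:
    """Placeholder aggregation: mean of the buffered raw readings."""
    return sum(buffer) / len(buffer)

# (1) Time-driven: wake up every interval and aggregate whatever has arrived.
def run_on_interval(buffer: list[float], interval_s: float, cycles: int) -> None:
    for _ in range(cycles):
        time.sleep(interval_s)
        if buffer:
            print("interval aggregate:", aggregate(buffer))
            buffer.clear()

# (2) Event-driven: aggregate only when a new reading arrives and the
#     buffer reaches a threshold size.
def on_new_reading(buffer: list[float], reading: float, threshold: int = 3) -> None:
    buffer.append(reading)
    if len(buffer) >= threshold:
        print("event aggregate:", aggregate(buffer))
        buffer.clear()

if __name__ == "__main__":
    # Event-driven demo: aggregation fires when the third reading arrives.
    event_buffer: list[float] = []
    for r in [20.1, 20.4, 20.2, 20.5]:
        on_new_reading(event_buffer, r)

    # Time-driven demo: one short cycle over a pre-filled buffer.
    interval_buffer = [20.5, 20.6]
    run_on_interval(interval_buffer, interval_s=0.1, cycles=1)
```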
Overview (Internet)
“Online aggregators recreate the functions of a publication like Reader’s Digest, collecting and arranging materials produced by others. But online, where the editors may be any combination of human staff, computer algorithms, and the audience itself, aggregators can publish an infinite number of virtual publications or programs — a new one for every audience member, every few minutes.”[2]
References
Source
- "Overview (Data)" section: NIST Special Publication 800-183, at 4-5.