The APISENSE® distributed platform [25, 24], developed by the University of Lille 1, aims to deliver a crowd-sensing-as-a-service solution to collect metrics in the wild and to analyze and visualize them in real time. APISENSE® targets the acquisition of both quantitative and qualitative metrics from a wide diversity of connected devices (tablets, smartphones, TVs, set-top boxes) that access the Internet through a large diversity of communication protocols and media (2G/3G/4G, WiFi, Bluetooth, ZigBee, Z-Wave, etc.). As part of this project, we will also work on incentive mechanisms to foster wide adoption of the tools by end-users. We will thus ensure a critical mass of key indicators to report, and eventually alert on, potential dysfunctions that could occur at any level of the infrastructure.
Datametrie GX, from ip-label, measures the quality of digital services as perceived by their users. It relies on robots that measure the availability and performance of your applications in controlled measurement environments (active monitoring), in addition to measurements generated by your actual users (real-user monitoring). In active monitoring, the Datametrie robots proactively traverse critical paths of your applications, from browsers or portable devices, through external access such as carrier backbones or through private access. These robots raise alerts on unavailability or performance degradation of your services. Real-user measurements analyze and monitor changes in performance as users experience them in each connection to your applications, from their own devices and access networks. This information helps you ensure the quality of each usage regardless of the context, allowing you to audit whether your applications are used under optimal conditions.
ACQUA is a framework and application developed at Inria, Diana team, for the modeling and estimation of Quality of Experience [3, 8]. It is meant to predict the Quality of Experience for the main applications the end-user runs. An application in ACQUA is modeled as a function that links network-level measurements to the expected Quality of Experience for that application. Statistical machine learning techniques are used to establish this link between measurements, both at the network level and the device level, and estimations of the Quality of Experience for different Internet applications. The data required for such learning can be obtained either by controlled experiments, as we did in a recent communication on Skype Quality of Experience, or by soliciting the crowd (i.e., crowdsourcing) for combinations of measurements and corresponding experience. The current version of ACQUA (available in Open Source under the GPLv3 licence) provides basic measurements for access performance and a decision-tree model for the Quality of Experience of Skype. Within this project, the development of ACQUA will continue in order to transform it into a full tool for the integration of models for Quality of Experience prediction and for the collection of end-users' feedback about their experience, together with measurements about their network access and their devices. This latter feedback is of prime importance for the calibration of new models for Quality of Experience and for the validation of models developed by controlled experiments within the laboratory.
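To illustrate the kind of model ACQUA integrates, the sketch below shows a hand-written decision tree mapping network-level measurements to a coarse QoE class for a Skype-like call. The thresholds and labels are purely illustrative assumptions, not ACQUA's calibrated model; in practice the tree would be learned from controlled experiments or crowdsourced data.

```python
# Illustrative sketch only: an ACQUA-style QoE model as a decision tree.
# All thresholds and class labels are hypothetical, not ACQUA's values.
from dataclasses import dataclass

@dataclass
class NetMeasurement:
    rtt_ms: float     # round-trip time to a reference server
    loss_pct: float   # packet loss percentage
    jitter_ms: float  # delay variation

def skype_qoe(m: NetMeasurement) -> str:
    """Return a coarse QoE label for a Skype-like call (hypothetical tree)."""
    if m.loss_pct > 5.0:
        return "bad"
    if m.rtt_ms > 400 or m.jitter_ms > 60:
        return "poor"
    if m.rtt_ms > 150 or m.loss_pct > 1.0:
        return "fair"
    return "good"

# Example: a good access link vs. a high-latency one.
print(skype_qoe(NetMeasurement(rtt_ms=80, loss_pct=0.2, jitter_ms=10)))   # good
print(skype_qoe(NetMeasurement(rtt_ms=500, loss_pct=0.5, jitter_ms=20)))  # poor
```

Replacing such a hand-coded tree with one trained on labeled (measurement, experience) pairs is exactly the calibration step that user feedback enables.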
HostView is a tool developed at Inria, Muse team, and is one of the first application-agnostic projects to combine the collection of network-level performance metrics and explicit user feedback on networked applications' performance. The initial version of HostView was developed for Linux and OS X. It collected network packet traces and application process executable names to infer network performance per application, as well as a number of end-host performance metrics such as WiFi signal strength/noise and the device's CPU load and memory utilization. In addition, HostView recorded user feedback on application performance with two mechanisms: a system-triggered questionnaire based upon the Experience Sampling Methodology (ESM), and a user-triggered mechanism called an "I'm annoyed" button. The ESM mechanism prompted the user no more than three times per day with a questionnaire about their experience with online applications. The "I'm annoyed" button was always displayed at the corner of the screen, and we asked users to click on it whenever they were dissatisfied with their application performance. Both mechanisms triggered the same questionnaire, which combined quantitative and qualitative feedback: (i) rate your Internet speed from 1 (slow) to 5 (very fast), (ii) identify any applications (from a list of those running) that you are unhappy with, (iii) indicate the problem (from a set of choices), and (iv) provide any additional information via a freeform text box. The initial HostView prototype was deployed on a small number (on the order of hundreds) of end hosts between November 2010 and April 2012. The experiment was announced in CS mailing lists and at the Internet Measurement Conference in 2010, so most users were computer scientists. As part of a European FP7 project (User-Centric Networking), we are working on an updated version of HostView that supports both Windows and Android.
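The two feedback mechanisms described above can be sketched as a single questionnaire record with two trigger sources, where the ESM path enforces the at-most-three-prompts-per-day cap. This is a hypothetical sketch: the record fields, class names, and scheduling logic are our assumptions, not HostView's actual implementation.

```python
# Hypothetical sketch of HostView's two feedback triggers ("esm" and
# "annoyed") feeding one questionnaire record. The 3-prompts-per-day cap
# follows the text; field and class names are illustrative assumptions.
import time
from dataclasses import dataclass, field

@dataclass
class Feedback:
    trigger: str                  # "esm" (system-triggered) or "annoyed" (button)
    speed_rating: int             # (i) 1 (slow) .. 5 (very fast)
    unhappy_apps: list = field(default_factory=list)  # (ii) from running apps
    problem: str = ""             # (iii) from a set of choices
    comment: str = ""             # (iv) freeform text

class EsmScheduler:
    """Allow at most MAX_PROMPTS_PER_DAY system-triggered prompts."""
    MAX_PROMPTS_PER_DAY = 3

    def __init__(self):
        self._prompt_times = []

    def may_prompt(self, now=None):
        now = time.time() if now is None else now
        day_ago = now - 86400
        # Keep only prompts issued within the last 24 hours.
        self._prompt_times = [t for t in self._prompt_times if t > day_ago]
        return len(self._prompt_times) < self.MAX_PROMPTS_PER_DAY

    def record_prompt(self, now=None):
        self._prompt_times.append(time.time() if now is None else now)

# Usage: three ESM prompts are allowed, the fourth is suppressed;
# the "I'm annoyed" button bypasses the scheduler entirely.
sched = EsmScheduler()
for _ in range(3):
    assert sched.may_prompt()
    sched.record_prompt()
print(sched.may_prompt())  # False: daily cap reached
fb = Feedback(trigger="annoyed", speed_rating=2, unhappy_apps=["video-call"])
```

The key design point the text describes is that both paths converge on the same questionnaire, so system-sampled and user-initiated feedback are directly comparable.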
We plan a large-scale measurement campaign in the near future to collect a more comprehensive dataset on network application QoE. Both the original HostView dataset and the new dataset will be available for our research in BottleNet.