
Talk - Must and May Abstractions for Markov Decision Processes


on 27-05-2011

Speaker: David Henriques (SQIG-IT).

May 27, 2011, Friday, 16h15m.

Abstract: Model checking has been extensively used to automatically verify properties of large-scale systems. However, the classical techniques only allow for checking qualitative properties, making them inadequate for inherently probabilistic systems. Probabilistic model checking (PMC) extends the suite of techniques of classical model checking with quantitative reasoning over probabilistic-nondeterministic systems, which combine probabilistic behaviour and nondeterministic choice. Abstraction, the main tool for dealing with the "state explosion" problem in the classical setting, is not well developed in the probabilistic setting, limiting the applicability of PMC to little more than checking reachability properties. Part of the reason for this limited success is that counterexamples in the probabilistic case are much more complicated objects than their classical counterparts, making counterexample analysis much harder. In this seminar, we will present an abstraction-refinement loop for probabilistic systems based on an extension of may and must abstractions from the classical setting. May and must abstractions are, respectively, over- and under-estimations of the behaviours of a system. For a large class of properties, one of the abstractions preserves satisfaction and the other preserves non-satisfaction, eliminating the need for (expensive) counterexample analysis. Joint work with Anvesh Komuravelli and Edmund Clarke.
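
To give a feel for the may/must idea, here is a minimal Python sketch; it is a simplification under assumptions of our own, not the construction presented in the talk, and the toy MDP, partition and helper names are all hypothetical. Concrete states are grouped into abstract blocks; a "may" block offers every lifted probabilistic choice available to some member, a "must" block only the choices available to all members, and value iteration on each abstraction brackets the maximal reachability probability.

    def lift(dist, partition):
        """Project a concrete distribution (list of (prob, successor)
        pairs) onto abstract blocks."""
        out = {}
        for p, succ in dist:
            block = partition[succ]
            out[block] = out.get(block, 0.0) + p
        return frozenset(out.items())

    def abstract_mdp(mdp, partition, mode):
        """mode="may": a block offers every lifted choice of SOME member
        (over-approximation). mode="must": only the choices offered by
        ALL members survive (under-approximation)."""
        blocks = {}
        for state, actions in mdp.items():
            block = partition[state]
            choices = {lift(d, partition) for d in actions.values()}
            if block not in blocks:
                blocks[block] = choices
            elif mode == "may":
                blocks[block] |= choices
            else:
                blocks[block] &= choices
        return blocks

    def max_reach(abs_mdp, goal, iters=100):
        """Value iteration for the maximal probability of reaching goal."""
        v = {b: 1.0 if b == goal else 0.0 for b in abs_mdp}
        for _ in range(iters):
            v = {b: 1.0 if b == goal else
                    max((sum(p * v[s] for s, p in d) for d in choices),
                        default=0.0)
                 for b, choices in abs_mdp.items()}
        return v

    # Toy MDP: state -> action -> list of (probability, successor).
    mdp = {
        "s0":   {"a": [(0.5, "s1"), (0.5, "s2")]},
        "s1":   {"a": [(1.0, "goal")]},
        "s2":   {"a": [(1.0, "s2")]},          # losing sink
        "goal": {"a": [(1.0, "goal")]},
    }
    partition = {"s0": "A", "s1": "B", "s2": "B", "goal": "G"}

    upper = max_reach(abstract_mdp(mdp, partition, "may"), "G")["A"]
    lower = max_reach(abstract_mdp(mdp, partition, "must"), "G")["A"]
    print(lower, "<= Pmax(reach goal) <=", upper)   # 0.0 <= ... <= 1.0

Here the true maximal reachability probability from s0 is 0.5, while the abstractions yield the bounds 0.0 and 1.0; tightening such gaps between the two bounds is exactly the job of the refinement loop.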

Room: 3.10, Mathematics
Support: SQIG/IT, with funding from FCT and FEDER, namely via the following projects:
* PTDC/MAT/68723/2006 KLog;
* POCI/MAT/55796/2004 QuantLog;
* POSC/EIA/55582/2004 Space-Time-Types.

Workshop "The Growing Demands on Connectivity and Information Processing in Radio Astronomy from VLBI to the SKA"


on 24-05-2011

This workshop intends to review current R&D trends in radio-astronomical data analysis and their convergence with the FP7/FP8 ICT roadmaps. The topics will cover astronomy and space science applications and deal with the technologies being investigated in projects ranging from electronic Very Long Baseline Interferometry (e-VLBI) to the SKA (Square Kilometre Array). Future paradigms for information processing up to the Exabyte and Exaflop regime will be discussed in collaboration with major industrial partners.

The distributed sensor networks currently used in radio astronomy are generating ever larger amounts of digital data, posing increasing demands on processing, transport and storage facilities. Networked instruments such as the e-EVN already send much of their data in real time via optical fibres, through national and international research networks. Networked infrastructure was critical for the establishment and success of recent telescopes such as ALMA, e-Merlin, LOFAR, e-VLA and e-EVN. The ongoing transition from 10 Gbps to 100+ Gbps networking infrastructure will lay the foundation for the next generation of instruments and drive much of the infrastructure cost of these large-scale projects.
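
As a rough, back-of-envelope illustration of what these line rates imply for storage (the rates below are round numbers, not measured figures for any particular instrument):

    def daily_volume_tb(rate_gbps):
        """Terabytes accumulated per day at a sustained bit rate."""
        bits_per_day = rate_gbps * 1e9 * 86_400   # bits/s * seconds/day
        return bits_per_day / 8 / 1e12            # bits -> terabytes

    for rate_gbps in (10, 100):
        print(f"{rate_gbps:>4} Gbps sustained -> "
              f"{daily_volume_tb(rate_gbps):,.0f} TB/day")
    # 10 Gbps -> 108 TB/day; 100 Gbps -> 1,080 TB/day (about 1 PB/day)

Even a single sustained 100 Gbps stream thus approaches a petabyte per day before any derived data products are stored.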

How to utilise the emerging ICT infrastructure will be of crucial importance for many SKA Pathfinders. The EC-funded NEXPReS programme (Novel EXplorations Pushing Robust e-VLBI Services) is investigating new signal transport and processing technologies. SKA Design Studies such as PrepSKA and the Aperture Array Verification Program (AAVP) will have a direct influence on the utilisation of network connectivity. The SKA, precursors such as ASKAP, MeerKAT and MWA, pathfinders such as APERTIF and, in general, any Aperture Array components will pose additional challenges for connectivity, processing and storage, representing an increase of several orders of magnitude over current information processing scenarios, and may lead to the adoption and testing of new technologies for the Future Internet. This growing connectivity will condition the final computing stages and science exploitation. These aspects will depend on new protocols for data formatting, and successors to the widespread IP technology may be needed to take full advantage of the expected performance increase on the physical transport side.
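
To illustrate what a data-formatting protocol can look like at this level, here is a purely hypothetical sketch of a fixed-layout frame header, loosely in the spirit of VLBI-style streaming formats; the field layout, sizes and names below are invented for illustration and do not correspond to any existing standard:

    import struct

    # Invented 16-byte header: epoch seconds, frame number within the
    # second, station id, channel id, payload length in bytes.
    HEADER = struct.Struct("!IIHHI")

    def pack_frame(seconds, frame_no, station, channel, payload):
        """Prepend the hypothetical header to a payload of samples."""
        header = HEADER.pack(seconds, frame_no, station, channel,
                             len(payload))
        return header + payload

    def unpack_frame(buf):
        """Split a received frame back into header fields and payload."""
        seconds, frame_no, station, channel, length = HEADER.unpack_from(buf)
        payload = buf[HEADER.size:HEADER.size + length]
        return (seconds, frame_no, station, channel), payload

    frame = pack_frame(1306195200, 42, 7, 3, b"\x00" * 8000)
    meta, data = unpack_frame(frame)
    print(meta, len(data))   # (1306195200, 42, 7, 3) 8000

Self-describing, fixed-size frames of this kind are one common design choice for high-rate streams, since they can be timestamped, routed and dropped independently without per-packet negotiation.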