1. On Spatial Dependence in Multivariate Singular Spectrum Analysis

    Richard Awichi


    In this paper, I present a method for exploiting the spatial information intrinsic to spatial data sets in order to improve the quality of temporal predictions within the framework of singular spectrum analysis (SSA). SSA-based techniques constitute a model-free approach to time series analysis, and SSA can ordinarily be applied to any time series with a notable structure. It has a wide range of applications, including the social sciences, medical sciences, finance, environmental sciences, mathematics, dynamical systems and economics. SSA has two broad aims:

    i) To decompose the original series into a sum of a small number of independent and interpretable components, such as a slowly varying trend, oscillatory components and structureless noise.
    ii) To reconstruct the decomposed series, with the noise component removed, for further analysis.

    Multivariate singular spectrum analysis (MSSA) extends SSA to multivariate time series and takes advantage of the delay-embedding procedure to obtain a formulation similar to that of SSA, though with larger matrices for multivariate data. Where spatial data are an important focus of investigation, it is not uncommon to have attributes whose values change over space and time, so accurate prediction is important. The natural question is whether the location information intrinsic to spatial data can improve the analysis of such data sets. The proposed method is based on the inverse distance weighting technique and is exemplified on climate data from Upper Austria for the period January 1994 to December 2009.

    Results show that the proposed technique of incorporating spatial dependence into MSSA analysis leads to improved quality of statistical inference.
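A minimal sketch of the inverse distance weighting idea underlying the proposed method is shown below; the station coordinates, target location and weighting exponent are illustrative assumptions, not values from the paper:

```python
import numpy as np

def idw_weights(target, stations, power=2.0):
    """Inverse distance weights of `stations` relative to `target`.

    target: (2,) array of coordinates; stations: (n, 2) array.
    Weights are proportional to 1 / distance**power and normalized to sum to 1.
    """
    d = np.linalg.norm(stations - target, axis=1)
    w = 1.0 / d**power
    return w / w.sum()

# Hypothetical station layout: nearer stations receive larger weights,
# so their series contribute more when combined in a joint MSSA analysis.
target = np.array([0.0, 0.0])
stations = np.array([[1.0, 0.0], [0.0, 2.0], [3.0, 4.0]])
w = idw_weights(target, stations)
```

The weights decay with distance, which is one simple way to encode spatial dependence before feeding the weighted series into an MSSA decomposition.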

    Keywords: time series analysis, MSSA, inverse distance weighting, spatial dependence.

    Pages: 1 – 12 | Full PDF Paper
  2. National Implementation of the GSBPM: The Egyptian Experience

    Ayman Hathoot

    Central Agency for Public Mobilization and Statistics (CAPMAS), Cairo, Egypt

    Abstract: The history of the Central Agency for Public Mobilization and Statistics (CAPMAS) in Egypt shows that each statistical product has had its own production system. Standardized and clear metadata explaining each production phase and the workflow were not applied, and a large amount of important information about statistical products, needed to produce modernized statistics, was lost. This made it necessary to rearrange the metadata infrastructure so as to produce statistics according to the fundamental principles of official statistics. The Generic Statistical Business Process Model (GSBPM) has been adopted as an agreed and preferred reference model with a flexible structure for maintaining and documenting statistical data produced in a standardized way. This paper presents the experience of implementing the GSBPM in CAPMAS: the work involved in launching and establishing the project, the obstacles encountered during implementation and the solutions used to overcome them, the benefits achieved, and future plans.

    Keywords: Metadata, official statistics, standardization.

    Pages: 13 – 21 | Full PDF Paper
  3. SDMX, a Key Standard for Central Banks’ Statistics

    Bruno Tissot

    BIS, Basel, Switzerland

    Abstract: Having structured and well-documented statistics on the economy and the financial system is an essential objective for central banks. This is reinforced by the fact that they collect, compile, share and disseminate a wide range of data – even more so since the Great Financial Crisis (GFC) of 2007/09 and the significant data needs it unveiled. The Statistical Data and Metadata eXchange (SDMX) standard provides both an information model and a syntax that enable standardisation of the structure as well as the presentation of statistical datasets and their supporting metadata. Triggered and supported by international organisations, this standard is now widely used by central banks to facilitate and streamline their statistical work, especially as regards the IT-related aspects. This is indeed a key conclusion of a related survey published by the Irving Fisher Committee on Central Bank Statistics (IFC) in 2016. Yet the SDMX standard needs to evolve further, not least to better handle the large micro data sets that have been in high demand since the crisis.

    Keywords: central banking, great financial crisis, statistical system, micro data.

    Pages: 22 – 30 | Full PDF Paper
  4. Riemann Function and Relativistic Structure

    Wang Yiping

    Zhejiang Quzhou Association of Senior Scientists and Technicians, Quzhou, Zhejiang 324000, China; Qianjiang Mathematics and Power Engineering Institute, Quzhou, Zhejiang 324000, China.

    Abstract: It is pointed out that the pointwise is a quantized unit concept of the Riemann function, featuring random “asymmetry, inhomogeneity and discontinuity”. An abstract relativistic structure of dimensionless quantities (called the circular logarithm and the supersymmetric matrix unit) is established by applying the principle of relativity, so as to ensure the “normativity and invariability” of each pointwise numerical value, location, property and topology with zero error, and to achieve, with the Riemann function, “topological variational rules without any specific content and the precise resolution that the elements and critical points (1/2) all lie on the {1/2} Z straight line”. The computing method is simple, stable, self-consistent and pragmatic, and is applicable across multiple disciplines.

    Keywords: Riemann function, pointwise quantization, relativity structure (circular log), critical point of limit value.

    Pages: 31 – 43 | Full PDF Paper