Forthcoming articles

International Journal of Computational Economics and Econometrics

International Journal of Computational Economics and Econometrics (IJCEE)

These articles have been peer-reviewed and accepted for publication but are pending final changes, are not yet published and may not appear here in their final order of publication until they are assigned to issues. Therefore, the content conforms to our standards but the presentation (e.g. typesetting and proof-reading) is not necessarily up to the Inderscience standard. Additionally, titles, authors, abstracts and keywords may change before publication. Articles will not be published until the final proofs are validated by their authors.

Forthcoming articles must be purchased for the purposes of research, teaching and private study only. These articles can be cited using the expression "in press". For example: Smith, J. (in press). Article Title. Journal Title.

Articles marked with this shopping trolley icon are available for purchase - click on the icon to send an email request to purchase.

Register for our alerting service, which notifies you by email when new issues are published online.

Articles marked with this Open Access icon are freely available and openly accessible to all without any restriction except the ones stated in their respective CC licenses.
We also offer RSS feeds which provide timely updates of tables of contents, newly published articles and calls for papers.

International Journal of Computational Economics and Econometrics (33 papers in press)

Regular Issues

  • Research Note: Futures Hedging with Stochastic Volatility: A New Method   Order a copy of this article
    by Moawia Alghalith, Christos Floros 
    Abstract: The aim of this paper is to present a continuous-time dynamic model of futures hedging. In particular, we extend the theoretical and empirical literature (e.g. Alghalith, 2016; Alghalith et al., 2015; and Corsi et al., 2008) in several important ways. First, we present a theory-based model. A significant empirical contribution is that we do not need data for the basis risk or the spot price. To the best of our knowledge, this is the first paper to assume that the volatility of the futures price is stochastic and thus to estimate the volatility of volatility of the futures price. Using daily futures data from the S&P500 index, we calculate an average daily volatility as well as the volatility of volatility of futures prices. We recommend that the managers of the futures market should report the stochastic volatility of the futures price (and its volatility), in addition to the traditional volatility.
    Keywords: stochastic volatility; volatility of volatility; futures; hedging.
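    A minimal illustrative sketch of the kind of quantities the abstract above describes: a rolling volatility of futures returns and the volatility of that volatility series. The simulated price series, window length and annualisation factor are placeholder assumptions, not the paper's model.

      import numpy as np
      import pandas as pd

      # Simulated placeholder for daily futures settlement prices; replace with real data.
      rng = np.random.default_rng(0)
      prices = pd.Series(2000 * np.exp(np.cumsum(rng.normal(0, 0.01, 1500))))

      returns = np.log(prices).diff().dropna()                  # daily log returns
      vol = returns.rolling(window=21).std() * np.sqrt(252)     # annualised rolling volatility
      vol_of_vol = vol.pct_change().rolling(window=21).std() * np.sqrt(252)

      print("average volatility:", round(vol.mean(), 4))
      print("average volatility of volatility:", round(vol_of_vol.dropna().mean(), 4))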

  • Stein-Rule Estimation in Genetic Carrier Testing   Order a copy of this article
    by Tong Zeng, Carter Hill 
    Abstract: In this paper, we apply the fully correlated random parameters logit (FCRPL) model to genetic carrier testing data using shrinkage estimation. We show that shrinkage estimates with a higher shrinkage constant improve the percentage of correctly predicted choices by 2% and 10% with the Jewish and general population samples, respectively. The mean elasticity estimates based on the shrinkage estimates are closer to those obtained with the FCRPL model estimates and have smaller standard errors than the corresponding results based on the uncorrelated random parameters logit model estimates.
    Keywords: pretest estimator; positive-part Stein-like estimator; likelihood ratio test; random parameters logit model.
    DOI: 10.1504/IJCEE.2019.10016587
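    The keywords above mention a positive-part Stein-like estimator driven by a likelihood ratio statistic. A generic numerical sketch of that combination rule is shown below; the shrinkage constant `a` and the example estimates are hypothetical and are not taken from the paper.

      import numpy as np

      def positive_part_stein(beta_unrestricted, beta_restricted, lr_stat, a):
          """Shrink the unrestricted estimate towards the restricted one:
          beta_tilde = beta_R + max(0, 1 - a / LR) * (beta_U - beta_R)."""
          weight = max(0.0, 1.0 - a / lr_stat)
          return (np.asarray(beta_restricted)
                  + weight * (np.asarray(beta_unrestricted) - np.asarray(beta_restricted)))

      beta_u = np.array([0.8, -1.2, 0.3])     # unrestricted estimates (hypothetical)
      beta_r = np.array([0.5, -1.0, 0.0])     # restricted estimates (hypothetical)
      print(positive_part_stein(beta_u, beta_r, lr_stat=6.0, a=2.0))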
     
  • Efficiency in Banking: Does the Choice of Inputs and Outputs Matter?   Order a copy of this article
    by Christos Floros, Constantin Zopounidis, Yong Tan, Christos Lemonakis, Alexandros Garefalakis, Efthalia Tabouratzi 
    Abstract: This paper examines banking efficiency using recent data from the PIGS countries (i.e. Portugal, Italy, Greece and Spain), which suffer from debt problems. We employ a 2-stage approach based on the effect of several balance-sheet items on cash flows and on DEA analysis. More specifically, we extend previous studies by giving attention to the deposit dilemma. The reported results show that the choice of inputs and outputs does matter in the case of European banking efficiency. Although the role of deposits is controversial, we find that deposits may be an output variable, due to liquidity issues that play a major role in the efficiency of the PIGS banking sector. We also report that the DEA model with deposits as an output variable generates efficiency scores that fall between periods. These results are helpful to bank managers and financial analysts dealing with efficiency modelling.
    Keywords: PIGS; Banking sector; Efficiency; Deposits dilemma; 2-stage approach; Cash flows; DEA; regression.
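    As a rough sketch of the DEA building block discussed above, the linear programme below computes an input-oriented, constant-returns-to-scale efficiency score for one unit; the tiny input/output matrices are made-up numbers, not the paper's bank data, and the paper's 2-stage design is not reproduced.

      import numpy as np
      from scipy.optimize import linprog

      def dea_ccr_input(X, Y, j0):
          """Efficiency of unit j0. X: (m inputs x n units), Y: (s outputs x n units)."""
          m, n = X.shape
          s = Y.shape[0]
          c = np.r_[1.0, np.zeros(n)]                    # minimise theta
          A_in = np.c_[-X[:, [j0]], X]                   # sum_j lam_j x_ij - theta*x_i,j0 <= 0
          A_out = np.c_[np.zeros((s, 1)), -Y]            # -sum_j lam_j y_rj <= -y_r,j0
          A_ub = np.r_[A_in, A_out]
          b_ub = np.r_[np.zeros(m), -Y[:, j0]]
          res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                        bounds=[(None, None)] + [(0, None)] * n, method="highs")
          return res.x[0]

      X = np.array([[20., 30., 40.], [5., 7., 9.]])      # two inputs, three banks (made up)
      Y = np.array([[10., 18., 20.]])                    # one output (made up)
      print([round(dea_ccr_input(X, Y, j), 3) for j in range(3)])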

  • Multi-period Mean-variance Portfolio Selection with Practical Constraints Using Heuristic Genetic Algorithms   Order a copy of this article
    by Yao-Tsung Chen, Hao-Qun Yang 
    Abstract: Since Markowitz proposed the mean-variance (MV) formulation in 1952, it has been used to configure various portfolio selection problems. However, Markowitz's solution is only for a single period. Multi-period portfolio selection problems have been studied for a long time, but most solutions depend on various forms of utility function, which are unfamiliar to general investors. Some works have formulated the problems as MV models and solved them analytically in closed form subject to certain assumptions. Unlike analytical solutions, genetic algorithms (GA) are more flexible because they can solve problems without restrictive assumptions. The purpose of this paper is to formulate multi-period portfolio selection problems as MV models and solve them by GA. To illustrate the generality of our algorithm, we implement a program in Microsoft Visual Studio to solve a multi-period portfolio selection problem for which there exists no general analytical solution.
    Keywords: Multi-period portfolio selection; Mean-variance formulation; Genetic algorithm; Transaction costs.
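    A heavily simplified, single-period illustration of selecting portfolio weights with a genetic algorithm under a mean-variance fitness function; the expected returns, covariance matrix and GA settings are hypothetical and much simpler than the paper's multi-period model with transaction costs.

      import numpy as np

      rng = np.random.default_rng(0)
      mu = np.array([0.08, 0.12, 0.10, 0.07])              # expected returns (made up)
      cov = np.diag([0.04, 0.09, 0.06, 0.03])              # covariance matrix (made up)
      risk_aversion, pop_size, generations = 3.0, 60, 200

      def fitness(w):
          return mu @ w - risk_aversion * (w @ cov @ w)    # mean-variance utility

      def normalise(pop):
          pop = np.clip(pop, 1e-6, None)                   # long-only weights
          return pop / pop.sum(axis=1, keepdims=True)

      pop = normalise(rng.random((pop_size, len(mu))))
      for _ in range(generations):
          scores = np.array([fitness(w) for w in pop])
          parents = pop[np.argsort(scores)[-pop_size // 2:]]          # keep the fittest half
          cut = rng.integers(1, len(mu), size=pop_size // 2)          # one-point crossover
          children = np.array([np.r_[a[:c], b[c:]]
                               for a, b, c in zip(parents, parents[::-1], cut)])
          children += rng.normal(0.0, 0.02, children.shape)           # mutation
          pop = normalise(np.vstack([parents, children]))

      best = pop[np.argmax([fitness(w) for w in pop])]
      print(np.round(best, 3))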

  • Using singular spectrum analysis for inference on seasonal time series with seasonal unit roots   Order a copy of this article
    by Dimitrios Thomakos, Hossein Hassani
    Abstract: The problem of optimal linear filtering, smoothing and trend extraction for m-period differences of processes with a unit root is studied. Such processes arise naturally in economics and finance, in the form of rates of change (price inflation, economic growth, financial returns) and finding an appropriate smoother is thus of immediate practical interest. The filter and resulting smoother are based on the methodology of Singular Spectrum Analysis (SSA). An explicit representation for the asymptotic decomposition of the covariance matrix is obtained. The structure of the impulse and frequency response functions indicates that the optimal filter has a "permanent" and a "transitory" component, with the corresponding smoother being the sum of two such components. Moreover, a particular form for the extrapolation coefficients that can be used in out-of-sample prediction is proposed. In addition, an explicit representation for the filtering weights in the context of SSA for an arbitrary covariance matrix is derived. This result allows one to examine the specific effects of smoothing in any situation. The theoretical results are illustrated using different data sets, namely U.S. inflation and real GDP growth.
    Keywords: Core inflation; Business cycles; Differences; Euro; Linear filtering; Trend extraction and prediction; Unit root.
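    A minimal, generic singular spectrum analysis sketch in the spirit of the abstract above (not its specific filter for m-period differences): embed the series, take an SVD, and reconstruct a smooth trend from the leading components by diagonal averaging. Window length and the simulated series are illustrative choices.

      import numpy as np

      def ssa_trend(x, window, n_components):
          x = np.asarray(x, dtype=float)
          n = len(x)
          k = n - window + 1
          traj = np.column_stack([x[i:i + window] for i in range(k)])   # trajectory matrix
          u, s, vt = np.linalg.svd(traj, full_matrices=False)
          low_rank = (u[:, :n_components] * s[:n_components]) @ vt[:n_components]
          recon = np.zeros(n)                        # diagonal averaging (Hankelisation)
          counts = np.zeros(n)
          for i in range(window):
              for j in range(k):
                  recon[i + j] += low_rank[i, j]
                  counts[i + j] += 1.0
          return recon / counts

      t = np.arange(300)
      series = 0.02 * t + np.sin(2 * np.pi * t / 12) + np.random.normal(0, 0.3, 300)
      print(ssa_trend(series, window=24, n_components=2)[:5])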

  • Bias decomposition in the Value at Risk calculation by a GARCH(1,1)   Order a copy of this article
    by Gholam Reza K. Haddad 
    Abstract: Recent research shows that Value at Risk (VaR) estimates are biased and calculated conservatively. Bao and Ullah (2004) proved that the bias of an ARCH(1) model for VaR can be decomposed into two parts: bias due to misspecification of the return distributional assumption (Bias1) and bias due to estimation error (Bias2). Using the quasi-maximum likelihood estimation method, this paper intends to find an analytical framework for these two sources of bias for GARCH(1,1). We generate returns from Normal and t-student distributions, then estimate the GARCH(1,1) under Normal and t-student assumptions. Our findings reveal that Bias1 equals zero for the Normal likelihood function, but Bias2 does not. Also, Bias1 and Bias2 are not zero for the t-student likelihood function, as analytically expected. However, all the biases become modest when the number of observations and the degrees of freedom are large.
    Keywords: Value-at-Risk; GARCH(1,1); Second-order bias.
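    For reference, a standard (uncorrected) one-day-ahead VaR from a Gaussian GARCH(1,1) fitted by quasi-maximum likelihood, using the third-party `arch` package; the simulated return series is a placeholder and no bias decomposition is attempted here.

      import numpy as np
      from scipy.stats import norm
      from arch import arch_model

      np.random.seed(0)
      returns = np.random.standard_t(df=8, size=1500)    # placeholder daily returns (%)

      model = arch_model(returns, vol="Garch", p=1, q=1, dist="normal")
      res = model.fit(disp="off")                        # quasi-maximum likelihood fit
      fcast = res.forecast(horizon=1)
      mu = fcast.mean.values[-1, 0]
      sigma = np.sqrt(fcast.variance.values[-1, 0])

      alpha = 0.05
      var_95 = -(mu + sigma * norm.ppf(alpha))           # 95% one-day VaR, reported as a positive number
      print(round(var_95, 3))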

  • Performance Evaluation of the Bayesian and classical Value at Risk models with circuit breakers set up   Order a copy of this article
    by Gholam Reza K. Haddad, Hadi Heidari 
    Abstract: Circuit breakers, like price limits and trading suspensions, are used to reduce price volatility in security markets. When returns hit price limits or are missing, observed returns deviate from equilibrium returns. This creates a challenge for predicting stock returns and modelling Value at Risk (VaR). In the Tehran Stock Exchange (TSE), circuit breakers are applied to control for excess price volatility. We extend Wei's (2002) model, in the framework of the Bayesian Censored and Missing-GARCH approach, to estimate VaR for the Iran Khodro Company (IKCO) share in the TSE. Using daily data for the period June 2006 to June 2016, we show that the Censored and Missing-GARCH model with t-student distribution outperforms. Kullback-Leibler (KLIC), Kupiec (1995) test and Lopez (1998) score outcomes show that the VaR estimated by the Censored and Missing-GARCH model with t-student distribution is the most accurate among all other classical and Bayesian estimation models.
    Keywords: Circuit Breakers; Censored and Missing–GARCH; Bayesian estimation; Value at Risk; Ranking Models.
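    A short sketch of the Kupiec (1995) unconditional-coverage likelihood-ratio test mentioned above, which compares a VaR model's violation rate with its nominal coverage; the observation and violation counts below are made-up.

      import numpy as np
      from scipy.stats import chi2

      def kupiec_pof(n_obs, n_violations, coverage=0.05):
          """LR statistic and p-value for H0: observed violation rate = coverage."""
          p, x, t = coverage, n_violations, n_obs
          phat = x / t
          log_lik_null = (t - x) * np.log(1 - p) + x * np.log(p)
          log_lik_alt = (t - x) * np.log(1 - phat) + x * np.log(phat)
          lr = -2.0 * (log_lik_null - log_lik_alt)
          return lr, 1.0 - chi2.cdf(lr, df=1)

      print(kupiec_pof(n_obs=1000, n_violations=62))   # are 62 violations consistent with 5%?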

  • Stages and determinants of European Union Small and Medium Sized firms failure process   Order a copy of this article
    by Alexios Makropoulos, Charlie Weir, Xin Zhang 
    Abstract: This paper uses a combination of factor and cluster analysis to identify and compare failure processes in small and medium-sized firms from a number of European Union countries. Panel data analysis is then used to identify the determinants of firms' transition from financial health towards liquidation in the alternative failure processes. The results suggest that there are four different firm failure processes. We find that financial performance and director characteristics differ between firm failure processes. We also find that the economic environment, the legal tradition of countries and excessive firm growth are determinants of the transition of firms towards liquidation across most firm failure processes. These findings may be of practical use to policy makers, lenders and risk managers, who will benefit from a better understanding of the differences between the alternative firm failure processes and of the determinants of a firm's transition towards liquidation within these failure processes.
    Keywords: SME failure; firm failure; firm failure process; factor/cluster analysis; ordered random effects regression; failure status transition.

  • Abnormal returns and systemic risk: evidence from a non-parametric bootstrap framework during the European sovereign debt crisis.   Order a copy of this article
    by Konstantinos Gkillas, Christos Floros, Christoforos Konstantatos, Dimitrios I. Vortelinos 
    Abstract: We investigate the impact of European Central Bank (ECB) interventions on major European and Turkish stock and credit default swap (CDS) markets, highlighting the importance of abnormal and excess abnormal returns in systemic risk. In particular, we examine the impact of ECB announcements (news) on major European and Turkish financial markets (stock and CDS indices) over a high- and low-volatility period, i.e. from November 6th, 2008 to December 31st, 2015. We also examine market efficiency by using both an event study methodology and the Capital Asset Pricing Model. Moreover, the impact of the ECB events is measured by an event study and a systemic risk analysis. The results show that investors exposed to Finland, Sweden, Austria and Spain tend to be more vulnerable to risk and volatility when ECB announcements are published.
    Keywords: abnormal returns; bootstrap; ECB events; financial crises.
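    A generic market-model event-study sketch of abnormal and cumulative abnormal returns around an announcement date (the paper's non-parametric bootstrap step is not reproduced); the simulated stock/market returns and window lengths are placeholders.

      import numpy as np

      rng = np.random.default_rng(1)
      market = rng.normal(0, 0.01, 300)
      stock = 0.0002 + 1.1 * market + rng.normal(0, 0.008, 300)
      event_index, est_window, event_window = 250, 200, 5

      est_m = market[event_index - est_window:event_index]
      est_s = stock[event_index - est_window:event_index]
      beta, alpha = np.polyfit(est_m, est_s, 1)               # market-model OLS over the estimation window

      win = slice(event_index - event_window, event_index + event_window + 1)
      abnormal = stock[win] - (alpha + beta * market[win])    # abnormal returns AR_t
      car = abnormal.cumsum()                                  # cumulative abnormal return
      print(round(car[-1], 4))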

  • An analysis of major Moroccan domestic sectors interdependencies and volatility spillovers using Multivariate GARCH models.   Order a copy of this article
    by Ouael EL JEBARI, Abdelati HAKMAOUI 
    Abstract: This paper aims to give a thorough analysis of the mechanisms of volatility spillovers, as well as a study of the time-varying interdependencies of the volatilities of seven major sectors of the Moroccan stock exchange, by proposing an empirical approach based on multivariate GARCH models. It uses daily data spanning the period between 02/07/2007 and 15/12/2016, covering seven principal sector indices. The results of the study confirm the existence of multiple volatility transmissions, in both directions and of both signs, between the sectors of our sample, along with a quasi-abundance of positive correlations suggesting possible contagion effects. More importantly, our findings are in line with those found in the U.S. financial market. The novelty of this article resides in the fact that it broadens previously documented studies, which focus mainly on external shocks, by providing a study of internal shocks while applying two multivariate GARCH models.
    Keywords: Volatility spillover; dynamic conditional correlations; interdependencies; domestic sectors; multivariate GARCH models.

  • A note on the use of the Box-Cox Transformation for Financial Data   Order a copy of this article
    by Dimitrios Kartsonakis Mademlis, Nikolaos Dritsakis 
    Abstract: This paper tests whether the Box-Cox transformation reduces the problem of non-normality in financial data.
    Keywords: ARIMA models; Box-Cox transformation; Box-Jenkins methodology; normality; stock market; oil prices.
    DOI: 10.1504/IJCEE.2020.10024440
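    A quick illustration of the Box-Cox transformation and a normality check on simulated positive "price" data; this is not the paper's data or test battery, only the generic scipy routine.

      import numpy as np
      from scipy import stats

      np.random.seed(0)
      prices = np.random.lognormal(mean=3.0, sigma=0.5, size=500)   # skewed, strictly positive data

      transformed, lam = stats.boxcox(prices)                       # lambda estimated by maximum likelihood
      jb_before = stats.jarque_bera(prices)
      jb_after = stats.jarque_bera(transformed)
      print("estimated lambda:", round(lam, 3))
      print("Jarque-Bera p-value before:", jb_before[1])
      print("Jarque-Bera p-value after: ", jb_after[1])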
     
  • Infrastructure Development and Income Inequality in India: An Empirical Investigation   Order a copy of this article
    by Varun Chotia 
    Abstract: The purpose of this paper is to investigate the relationship between infrastructure development and income inequality in India for the period 1991 to 2012. First, we represent infrastructure development by using principal component analysis to construct an overall index based on four major sub-sectors of the overall infrastructure sector (transport, water and sanitation, telecommunications and energy). We then empirically investigate the relationship between infrastructure development and income inequality using the Auto Regressive Distributed Lag (ARDL) bounds testing approach. The stationarity properties of the variables are checked by ADF, DF-GLS and KPSS unit root tests. The co-integration test confirms the presence of a long-run relationship between infrastructure development and income inequality for India. The ARDL test results indicate that infrastructure development does not help in reducing income inequality. Both inflation and economic growth amplify income inequality in the long run as well as the short run, whereas trade openness turns out to be the indicator that is able to decrease the gap between rich and poor in India. The study calls for adopting economic policies and reforms aimed at developing and strengthening infrastructure and bringing more investment into the sector, in order to achieve the much-needed inclusive growth and ultimately reduce the income inequality currently prevailing in India. To date, very few studies have attempted to examine the infrastructure development and income inequality nexus by including the Gini coefficient as a proxy for inequality for India and applying the ARDL approach to investigate the short-run and long-run dynamics. Hence, the contribution of this paper is to fill these research gaps.
    Keywords: Infrastructure development; Income inequality; Co-integration; Auto regressive distributed lag (ARDL).
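    A sketch of building a composite index as the first principal component of standardised sub-sector series, as described in the abstract above; the column names and the handful of yearly values are hypothetical stand-ins for the transport, water and sanitation, telecom and energy indicators.

      import pandas as pd
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      # Hypothetical sub-sector indicators for a few years (made-up numbers).
      data = pd.DataFrame({
          "transport":        [2.1, 2.4, 2.9, 3.3, 3.8],
          "water_sanitation": [1.0, 1.2, 1.3, 1.5, 1.6],
          "telecom":          [0.4, 0.9, 1.8, 3.1, 4.6],
          "energy":           [5.0, 5.4, 5.9, 6.3, 6.8]},
          index=[1991, 1992, 1993, 1994, 1995])

      scaled = StandardScaler().fit_transform(data)
      pca = PCA(n_components=1)
      index = pd.Series(pca.fit_transform(scaled).ravel(), index=data.index,
                        name="infrastructure_index")
      print(round(float(pca.explained_variance_ratio_[0]), 3))
      print(index)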

  • The Role of R&D in Economic Growth in Arab Countries   Order a copy of this article
    by Mohammed Shahateet 
    Abstract: This paper explores the impact of research and development (R&D) activities on economic growth in 12 Arab countries of the Middle East. We conduct different pre-estimation tests in order to select the appropriate econometric model, including tests of cross-sectional dependence, stationarity, causality, and cointegration. We perform post-estimation diagnostics to test for both long-run and short-run relationships by applying a panel Auto Regressive Distributed Lag (ARDL) model using data for the period 1996-2016. The long-run analysis confirms that R&D activities positively affect economic growth, while in the short run this relationship is insignificant.
    Keywords: Economic growth; R&D; ARDL model; Panel data; Arab countries.

  • Economic and Business Cycle with Time Varying in India: Evidence from ICT Sector   Order a copy of this article
    by Chukiat Chaiboonsri 
    Abstract: Combining the theoretical concept of the Real Business Cycle (RBC) with computational econometric modelling for ICT systems, the purpose of this paper is divided into two main sections. The first part studies the relationship between Indian ICT industries and GDP by applying Bayesian inference, now regarded as the modern statistics of this era. For the data, five predominant yearly indexes collected during 2000 to 2015, including Indian GDP, fixed phone usage, mobile phone distribution, Internet servers, and broadband suppliers, are analysed by employing the Markov-Switching model (MS model) and Bayesian Vector Autoregressive (BVAR) models. The second section is the application of a time-varying parameter VAR model with stochastic volatility (TVP-VAR). Based on Bayesian statistics, this time-varying analysis can more clearly provide extended insight into the underlying structure of the economy in a flexible and robust way. In addition, a Bayesian regression model is used to investigate the ICT multiplier related to Indian economic growth. The empirical results indicate that IT sectors are becoming a major driver of Indian economic expansion in the near future, compared with telecommunication sectors. Moreover, the result for the ICT multiplier confirms that high-technology industrial zones should be systematically and continuously enhanced, in particular research and development in cyberspace. This additionally confirms that the high-tech industries play an important role in raising levels of employment in India and in reducing poverty at the socio-economic level.
    Keywords: Information and Communication Technology (ICT); Bayesian Inference; Markov-Switching Model (MS-model); Bayesian Vector Autoregressive model (BVAR); Time-Varying Parameter VAR (TVP-VAR).

  • Determinants of risk sharing via exports: Trade openness and Specialization   Order a copy of this article
    by Faruk Balli, Eleonora Pierucci, Jian Gan 
    Abstract: Economic theory predicts that one of the main benefits of financial globalisation is the improvement of international risk sharing. In this paper, we provide an empirical evaluation of the determinants of risk sharing via exports. We conclude that risk sharing via exports is somewhat important in emerging countries but not among OECD countries. More importantly, we find that trade openness and production/export specialisation generally have, with some exceptions, a positive and statistically significant relationship with risk sharing. On the contrary, concentration of export destinations proves to be negatively correlated with risk sharing.
    Keywords: risk sharing; production specialisation; export specialisation; trade openness; financial globalisation.
    DOI: 10.1504/IJCEE.2020.10023093
     
  • Value-added in high technology and industrial basic research: a weighted network observing the trade of high-tech goods   Order a copy of this article
    by Antonio Zinilli, Mario De Marchi 
    Abstract: Expenditure on research and development (R&D) is a key indicator of the innovative efforts of countries. In this paper, we want to examine through a new approach the relationship between basic research and economic benefits. Although we are aware that the topic has already been extensively addressed, this paper differs from the previous literature because it focuses on value added trade instead of gross trade flows. We use an Exponential Random Graph Model for weighted networks to study the impact of private investment in basic research on value-added of exports in high technology, namely the domestic value added absorbed abroad. We want to understand if this measure (without intermediate imports) is able to confirm the results of the previous literature, which used gross flows. Our results show that private investment in basic research has a positive influence on exports, giving a competitive advantage in international trade.
    Keywords: Exponential random graph model; Weighted Networks; Industrial Basic Research; Trade in Value Added.

  • Persistent dynamics in (in)determinate equilibrium rational expectations models   Order a copy of this article
    by Marco Maria Sorge 
    Abstract: Equilibrium indeterminacy in rational expectations models is often claimed to produce higher time series persistence relative to determinacy. Proceeding by means of a simple linear stochastic model, I formally show that, for reasonable parameter configurations, there exists an uncountable (continuously infinite) set of indeterminate equilibria in low-order AR(MA) representation, which exhibit strictly lower persistence than their determinate counterpart. Implications for empirical studies concerned with e.g. testing for indeterminacy and macroeconomic forecasting are discussed.
    Keywords: Rational expectations; Indeterminacy; Persistence.

  • Size distribution analysis in the study of urban systems: evidence from Greece   Order a copy of this article
    by Dimitrios Tsiotas, Labros SDROLIAS, Georgios ASPRIDIS, Dagmar SKODOVA-PARMOVA, Zuzana DVORAKOVA-LISKOVA 
    Abstract: This paper empirically examines the utility of size-distribution analysis in the study of urban systems, on data referring to every urban settlement recorded in the 2011 national census of Greece. The study lowers the scale of the size-distribution analysis to the regional level, instead of the national level where it is commonly applied, examining two aspects of size distributions, the rank-size and the city-size distribution, in comparison with three well-established statistical dispersion indices: the coefficient of variation, the Theil index and the Gini coefficient. The major research question is to detect how capable the size-distribution exponents are of operating as measures of statistical dispersion and of capturing socioeconomic information. Overall, the analysis concludes that size-distribution assessment is useful for the initialisation of the study of urban systems, where the available information is restricted to population size, and is capable of providing structural information about an urban system and its socioeconomic framework, but is not more effective than other measures of statistical dispersion.
    Keywords: power law; Zipf’s law; Regional Economics; cities; Regional Science; Econophysics.
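    A minimal rank-size (Zipf) regression in the spirit of the analysis above: regress log rank on log settlement size and read the slope as the Zipf exponent. The population vector below is made up for illustration, not census data.

      import numpy as np

      populations = np.array([3_000_000, 800_000, 650_000, 300_000, 220_000,
                              170_000, 160_000, 150_000, 140_000, 130_000])   # made-up sizes
      sizes = np.sort(populations)[::-1]
      ranks = np.arange(1, len(sizes) + 1)

      slope, intercept = np.polyfit(np.log(sizes), np.log(ranks), 1)
      print("estimated Zipf exponent:", round(-slope, 3))   # close to 1 under Zipf's law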

  • Factor decomposition of disaggregate inflation: the case of Greece   Order a copy of this article
    by Nikolaos Krimpas, Paraskevi Salamaliki, Ioannis Venetis 
    Abstract: We use static and dynamic factor models to decompose Greek inflation into common components. Static factor analysis suggests the need to develop comprehensive underlying inflation measures for Greece. Dynamic factor analysis decomposes inflation into three components: pure inflation and relative price inflation, both driven by aggregate shocks, and an idiosyncratic component reflecting sector-specific shocks. We verify the idiosyncratic component as the main source of inflation variability, while pure inflation and its associated shocks are dominant compared to relative inflation. Based on pure inflation correlations, the relative weight of anticipated monetary shocks is large only for the spread between the 10-year government bond yield and a three-month short-run rate, and only in times of monetary stability.
    Keywords: Disaggregate CPI; Dynamic Factor Model; Pure inflation; Relative prices.

  • Computational method for approximating the behaviour of a triopoly: an application to the mobile telecommunications sector in Greece   Order a copy of this article
    by Yiannis Bassiakos, Zacharoula Kalogiratou, Theodoros Monovasilis, Nicholas Tsounis 
    Abstract: Computational biology models of the Volterra-Lotka family, known as competing species models, are used for modelling a triopoly market, with an application to the mobile telecommunications sector in Greece. Using a data sample for 1999-2016, parameter estimation with non-linear least squares is performed. The findings show that the proportional change in the market share of each of the two largest companies, Cosmote and Vodafone, depends negatively on the market share of the other. Further, the market share of the market leader, Cosmote, depends positively on the market share of the smallest company, Wind. The proportional change in the market share of Wind depends negatively on the market share of the largest company, Cosmote, but positively on the change in the market share of the second company, Vodafone. In the long run, it was found that the market shares tend to a stable equilibrium point where all three companies survive, with Cosmote having a projected number of approximately 7.3 million subscribers after eleven years (in 2030), Vodafone 4.9 million and Wind 3.7 million, the total projected market size being approximately 16 million customers.
    Keywords: Volterra-Lotka models; triopoly; mobile telecommunications sector.
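    A generic three-species competing-species (Volterra-Lotka) system solved numerically, to illustrate the class of model described above; the growth rates, interaction matrix, carrying capacities and initial values are arbitrary, not the estimates reported in the paper.

      import numpy as np
      from scipy.integrate import solve_ivp

      r = np.array([0.6, 0.5, 0.4])              # intrinsic growth rates (arbitrary)
      A = np.array([[1.0, 0.5, 0.3],             # competition coefficients (arbitrary)
                    [0.6, 1.0, 0.4],
                    [0.4, 0.3, 1.0]])
      K = np.array([8.0, 5.0, 4.0])              # carrying capacities, in millions (arbitrary)

      def competing_species(t, x):
          return r * x * (1.0 - (A @ x) / K)

      sol = solve_ivp(competing_species, t_span=(0, 50), y0=[5.0, 4.0, 2.5],
                      t_eval=np.linspace(0, 50, 200))
      print(sol.y[:, -1])                        # long-run subscriber numbers at the equilibrium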

  • Separating Yolk from White: A Filter based on Economic Properties of Trend and Cycle   Order a copy of this article
    by Peng Zhou 
    Abstract: This paper proposes a new filter technique to separate trend and cycle based on stylised economic properties of trend and cycle, rather than relying on ad hoc statistical properties such as frequency. Given the theoretical separation between economic growth and business cycle literature, it is necessary to make the measures of trend and cycle match what the respective theories intend to explain. The proposed filter is applied to the long macroeconomic data collected by the Bank of England (1700-2015).
    Keywords: Filter; Trend; Cycle.

  • Efficiency of microfinance institutions of South Asia: A bootstrap DEA approach   Order a copy of this article
    by Asif Khan, Rachita Gulati 
    Abstract: MFIs are special types of institutions that operate with dual goals: financial sustainability and social outreach. The present paper therefore aims to assess efficiency levels in the attainment of this dual mission for MFIs operating in four selected South Asian countries (Bangladesh, India, Nepal and Pakistan) during the financial years 2010 to 2015. Conventional data envelopment analysis (DEA) models do not possess statistical properties and may consequently produce biased efficiency estimates. We therefore incorporate the homogeneous bootstrap procedure in the DEA framework suggested by Simar and Wilson (1998, 2000) to estimate bias-corrected efficiency scores for individual MFIs. In addition, we design two separate DEA models to assess MFI performance from both perspectives, financial and social, simultaneously. We first detect and remove outliers from the dataset using the procedure suggested by Banker and Gifford (1988), based on the super-efficiency concept, and then proceed further. The empirical results confirm that, on average, South Asian MFIs remained more financially than socially efficient during the study period. However, financial efficiency has decreased over time while social efficiency has improved slightly. Further, among the peer nations, Indian MFIs outperform in terms of both financial and social aspects, followed by Nepali and Bangladeshi MFIs, respectively. The Pakistani MFIs were found to be the weakest performers in both social outreach and financial sustainability.
    Keywords: Bias-corrected efficiency; financial efficiency; social efficiency; bootstrap data envelopment analysis; DEA; bootstrap DEA; microfinance institutions; MFIs; microfinance; South Asia.

  • Bootstrapping the Log-periodogram Estimator of the Long-Memory Parameter: is it Worth Weighting?
    by Saeed Heravi, Kerry Patterson 
    Abstract: Estimation of the long-memory parameter from the log-periodogram (LP) regression, due to Geweke and Porter-Hudak (GPH), is a simple and frequently used method of semi-parametric estimation. However, the simple LP estimator suffers from a finite sample bias that increases with the dependency in the short-run component of the spectral density. In a modification of the GPH estimator, Andrews and Guggenberger, AG (2003) suggested a bias-reduced estimator, but this comes at the cost of inflating the variance. To avoid variance inflation, Guggenberger and Sun (2004, 2006) suggested a weighted LP (WLP) estimator using bands of frequencies, which potentially improves upon the simple LP estimator. In all cases a key parameter in these methods is the need to choose a frequency bandwidth, m, which confines the chosen frequencies to be in the ‘neighbourhood’ of zero. GPH suggested a ‘square-root’ rule of thumb that has been widely used, but has no optimality characteristics. An alternative, due to Hurvich and Deo (1999), is to derive the root mean square error (rmse) optimising value of m, which depends upon an unknown parameter, although that can be consistently estimated to make the method feasible. More recently, Arteche and Orbe (2009a,b), in the context of the GPH estimator, suggested a promising bootstrap method, based on the frequency domain, to obtain the rmse value of m that avoids estimating the unknown parameter. We extend this bootstrap method to the AG and WLP estimators and to consideration of bootstrapping in the frequency domain (FD) and the time domain (TD) and, in each case, to ‘blind’ and ‘local’ versions. We undertake a comparative simulation analysis of these methods for relative performance on the dimensions of bias, rmse, confidence interval width and fidelity.
    Keywords: Long memory; bootstrap; log-periodogram regression; variance inflation; weighted LP regression; time domain; frequency domain.
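    A sketch of the simple (unweighted) GPH log-periodogram estimator of the long-memory parameter d with the common "square-root" bandwidth rule discussed above; it is applied here to simulated white noise, for which the estimate should be close to zero, and none of the bias-reduced, weighted or bootstrap refinements are implemented.

      import numpy as np

      def gph_estimate(x, m=None):
          x = np.asarray(x, dtype=float)
          t = len(x)
          m = int(np.sqrt(t)) if m is None else m              # bandwidth m ("square-root" rule)
          freqs = 2.0 * np.pi * np.arange(1, m + 1) / t        # Fourier frequencies near zero
          dft = np.fft.fft(x - x.mean())[1:m + 1]
          periodogram = (np.abs(dft) ** 2) / (2.0 * np.pi * t)
          regressor = np.log(4.0 * np.sin(freqs / 2.0) ** 2)
          slope, _ = np.polyfit(regressor, np.log(periodogram), 1)
          return -slope                                        # GPH estimate of d

      np.random.seed(0)
      print(round(gph_estimate(np.random.normal(size=2048)), 3))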

Special Issue on: ICOAE2017 Applied Economics

  • Real Options Games Between Two Competitors: The Case of Price War   Order a copy of this article
    by Elżbieta Rychłowska-Musiał 
    Abstract: This paper takes up the subject of optimal investment strategies for firms acting in a competitive market. An investment decision-making process is described as a game between two players, and the real options approach is used to find the value of an investment project; the paper therefore falls in the area of real options games. We discuss different games which can occur between competitors and formulate recommendations for them. It comes as no surprise that the advantage is primarily on the side of the dominant company, but under certain circumstances a weaker party has a very strong bargaining chip. Competition between firms can take the form of a price war. To mitigate the possible effects of a price war, firms may cooperate, and their negotiations could be supported by a payoff transfer computed as the coco value. It also turns out that possible cooperation between competitors gains significance when project risk is high, as well as when the price war is cut-throat.
    Keywords: real options; investment option; real options games; competition; price wars; bargaining game; cooperative-competitive value; coco value.

  • Modelling Agricultural Risk in a Large Scale Positive Mathematical Programming Model   Order a copy of this article
    by Ivan Arribas, Kamel Louhichi, Angel Perni, Jose Vila, Sergio Gomez-y-Paloma 
    Abstract: Mathematical programming has been extensively used to account for risk in farmers' decision making. The recent development of Positive Mathematical Programming (PMP) has renewed the need to incorporate risk in a more robust and flexible way. Most of the existing PMP-risk models have been tested at farm-type level and for a very limited sample of farms. This paper presents and tests a novel methodology for modelling risk at individual farm level in a large-scale model, called IFM-CAP (Individual Farm Model for Common Agricultural Policy analysis). Results show a clear trade-off between including and excluding the risk specification. Although both alternatives provide very close estimates, simulation results show that the explicit inclusion of risk in the model allows risk effects on farmer behaviour to be isolated. However, this specification triples the computation time required for estimation.
    Keywords: agriculture; positive mathematical programming; risk and uncertainty; expected utility; Highest Posterior Density; European Common Agricultural Policy.

  • Overvaluation in a non-optimal currency area   Order a copy of this article
    by Carlos Encinas-Ferrer 
    Abstract: The devaluation tool in an optimal currency area with monetary sovereignty is of significant importance in determining economic policies that adjust relative costs and interest rates to the situation a country faces in front of economic shocks, either asymmetric or generalized. Devaluation risk is due not only to domestic inflation but to its relationship with that of a country's major trading partners. If a nation's inflation is greater than the average of its trading partners and the gap between them is not adjusted by the depreciation of its currency, a process of overvaluation begins. This overvaluation ends up manifesting itself in a growing lack of competitiveness in foreign trade, shown by a trade deficit, reduced gross domestic product (GDP) and rising unemployment. Devaluation or depreciation would restore the competitiveness of the productive apparatus. However, in a non-optimal currency area (such as a unilaterally dollarized country) this adjustment may be made by abandoning the anchor currency and adopting a new national currency, in what has been called remonetization (Encinas-Ferrer 2003-1 and 2). The Eurozone experience since 2011 shows us, however, that abandoning a non-optimal currency area and establishing a new national currency is a very difficult decision that no one has dared to take until now.
    Keywords: overvaluation; optimal currency areas; non-optimal currency areas; euro-zone.

  • An Analysis of Long-Run Relationship between ICT Sectors and Economic Growth: Evidence from ASEAN Countries   Order a copy of this article
    by Chukiat Chaiboonsri, Satawat Wannapan 
    Abstract: This paper investigates the causal panel relationship between information and communication technology (ICT) segments and economic expansion rates in ASEAN countries. Methodologically, panel time-series data observed during 2006 to 2016 are employed to estimate the panel Granger causality test. To address the technical problem of lag selection for the panel causal analysis, the computational statistical approach called Newton's optimization method is applied to verify the suitable lag selection. The empirical results show that ICTs are not the major factor causally motivating economic growth in ASEAN. This is confirmed by the extended section on the Autoregressive Distributed Lag (ARDL) cointegration approach, which is based on Bayesian statistics combined with the simulation method called Markov Chain Monte Carlo (MCMC). The results state that Thailand is the only one among the eight selected ASEAN countries to exhibit a long-run relationship between ICTs and GDP. It can be concluded that the ICT sectors are not sustainable drivers of economic growth in ASEAN. To address this issue, equitable educational systems and advanced infrastructural developments are the primary measures that should be cooperatively implemented.
    Keywords: ICT segments; economic growth; long-run relationship; ASEAN countries; Bayesian approach.

  • Evidence for the Globalization Types Model Integrating Different Trade Theories   Order a copy of this article
    by Bruno G. Rüttimann 
    Abstract: This paper summarizes the research work performed during the last ten years on the globalization phenomenon, measuring and analyzing the evolution of trade globalization over the period 2003-2015. The goal was to find evidence for a new Globalization Types Model. Indeed, the economic system has become more complex in recent years, and current trade models are not able individually to capture all the different aspects, nor are they universally applicable. The evolution of globalization has been measured with a new entropy-based metric that computes the interweavement of trade flows. The research shows that world trade as a whole has been globalizing during recent years, but with different patterns: de-globalizing for advanced economic regions and globalizing for emerging economic regions. These differences can be explained with the Globalization Types Model by integrating neoclassical trade theories. Furthermore, the aggregated result seems to confirm an inverse Kuznets evolution of globalization. The analysis has led to the enunciation of seven Trade Globalization Postulates explaining the phenomenon of trade globalization evolution.
    Keywords: Globalization types; globalization forms; globalization models; inequality metric; trade flows; foreign trade; trade theory; Kuznets.
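    A rough illustration of an entropy-based "interweavement" measure for a bilateral trade-flow matrix: the Shannon entropy of off-diagonal flow shares, normalised by its maximum. This is only a generic metric in the spirit of the abstract above, not the paper's specific formulation, and the flow matrix is made up.

      import numpy as np

      def trade_entropy(flows):
          """Normalised Shannon entropy of bilateral trade-flow shares (0 to 1)."""
          shares = flows / flows.sum()
          shares = shares[shares > 0]
          entropy = -np.sum(shares * np.log(shares))
          n = flows.shape[0]
          return entropy / np.log(n * (n - 1))     # 1 = perfectly even (fully interwoven) flows

      flows = np.array([[0., 120., 40., 10.],      # exporter x importer, made-up values
                        [90., 0., 60., 20.],
                        [30., 50., 0., 80.],
                        [10., 25., 70., 0.]])
      print(round(trade_entropy(flows), 3))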

Special Issue on: MIC 2017 Managing the Global Economy

  • Stabilization Policies in a Small Euro Area Economy: Taxes or Expenditures? A Case Study for Slovenia   Order a copy of this article
    by Klaus Weyerstrass, Reinhard Neck, Dmitri Blueschke, Boris Majcen, Andrej Srakar, Miroslav Verbič 
    Abstract: In this paper we investigate how effective stabilization policies can be in a small open economy which is part of the Euro Area, namely Slovenia. In particular, we analyse whether tax policy or expenditure policy has stronger multiplier effects. Slovenia is an interesting case because it is a small open economy in Central Europe that was already in the Euro Area before the Great Recession. Using the SLOPOL10 model, an econometric model of the Slovenian economy, we analyse the effectiveness of some categories of taxes and public spending. Some of these instruments are targeted towards the demand side, while others primarily influence the supply side. Our results show that those public spending measures that entail both demand and supply side effects are more effective at stimulating real GDP than pure demand side measures. Measures that improve the education level of the labour force are very effective at stimulating potential GDP. Employment can be most effectively stimulated by cutting the income tax rate and the social security contribution rate, i.e. by reducing the tax wedge on labour income, thereby positively affecting Slovenia's international competitiveness. On the other hand, simulations show that fiscal policy measures can only mitigate but not undo the adverse effects of a crisis like the Great Recession.
    Keywords: Macroeconomics; stabilization policy; fiscal policy; tax policy; public expenditures; Slovenia; public debt; econometric model; simulation.

Special Issue on: Machine Learning, Artificial Intelligence, and Big Data Methods and New Perspectives

  • A SAS Macro for Examining Stationarity Under the Presence of Endogenous Structural Breaks   Order a copy of this article
    by Dimitrios Dadakas, Scott Fargher 
    Abstract: The endogenous structural break literature presents numerous computationally intensive procedures for the examination of stationarity under the presence of single or multiple structural breaks. Application of these grid-search procedures is rather complicated, and not many researchers have access to code that can easily be applied. In this article, we present a SAS macro that allows the examination of stationarity under the assumption of either one or two endogenously determined structural breaks, using the Zivot and Andrews (1992) and the Lumsdaine and Papell (1997) methodologies. We demonstrate the macro using the Nelson-Plosser (1982) data, which were also used by Zivot and Andrews (1992) and Lumsdaine and Papell (1997), to highlight differences and similarities between the macro's output and the originally published results.
    Keywords: Endogenous Structural Breaks; Stationarity; Time Series; SAS; Macro.
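    As an aside for readers working in Python rather than SAS, the single-break Zivot-Andrews test is available in statsmodels and can serve as a cross-check for the one-break case (the two-break Lumsdaine-Papell procedure is not included there). The random-walk series below is only a placeholder, and this is not the paper's macro.

      import numpy as np
      from statsmodels.tsa.stattools import zivot_andrews

      np.random.seed(0)
      series = np.cumsum(np.random.normal(size=250))    # placeholder unit-root series

      stat, pvalue = zivot_andrews(series, regression="c", trim=0.15)[:2]
      print(round(stat, 3), round(pvalue, 3))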

  • Automated detection of entry and exit nodes in traffic networks of irregular shape   Order a copy of this article
    by Simon Plakolb, Christian Hofer, Georg Jäger, Manfred Füllsack 
    Abstract: We devise an algorithm that can automatically identify the entry and exit nodes of an arbitrary traffic network. It is applicable even if the network is of irregular shape, which is the case for many cities. Additionally, the method can calculate the nodes' attractiveness to commuters. This technique is then used to improve a traffic model, so that it is no longer dependent on expert knowledge and manual steps and can thus be used to analyse arbitrary traffic systems. Evaluation of the algorithm is twofold: the positions of the identified entry nodes are compared to existing traffic data, and a more in-depth analysis uses the traffic model to simulate a city in two ways, once with hand-picked entry nodes and once with automatically detected ones. The evaluation shows that the simulation yields a good match to the real-world data, substantiating the claim that the algorithm can fully replace a manual identification process.
    Keywords: traffic modelling; network analysis; commuting; automated detection; entry nodes; exit nodes; traffic simulation; mobility behaviour; agent-based model; road usage; congestion.

  • Depth based support vector classifiers to detect data nests of rare events   Order a copy of this article
    by Rainer Dyckerhoff, Hartmut Jakob Stenz 
    Abstract: The aim of this project is to combine data depth with support vector machines (SVM) for binary classification. To this end, we introduce data depth functions and SVM and discuss why a combination of the two is assumed to work better in some cases than using SVM alone. For two classes X and Y, one investigates whether an individual data point should be assigned to one of these classes. In this context, our focus lies on the detection of rare events, which are structured in data nests: class X contains many more data points than class Y, and Y has less dispersion than X. This form of classification problem is akin to finding the proverbial needle in a haystack. Data structures like these are important in churn prediction analyses, which serve as a motivation for possible applications. Beyond the analytical investigations, comprehensive simulation studies are also carried out.
    Keywords: Data depth; DD-plot; Mahalanobis depth function; support vector machines; binary classification; hybrid methods; rare events; data nest; churn prediction; big data.
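    A small sketch of the general idea above: map each observation to its Mahalanobis depth with respect to the two classes (a DD-plot style representation) and train a support vector machine on those depth coordinates. The imbalanced synthetic data mimic a "rare events in a nest" setting; all settings are illustrative, not the paper's experimental design.

      import numpy as np
      from sklearn.svm import SVC

      def mahalanobis_depth(points, sample):
          mu = sample.mean(axis=0)
          inv_cov = np.linalg.inv(np.cov(sample, rowvar=False))
          diff = points - mu
          d2 = np.einsum("ij,jk,ik->i", diff, inv_cov, diff)   # squared Mahalanobis distances
          return 1.0 / (1.0 + d2)                              # Mahalanobis depth

      rng = np.random.default_rng(0)
      x_major = rng.normal(0.0, 2.0, size=(2000, 3))           # large, dispersed class X
      y_minor = rng.normal(1.5, 0.3, size=(60, 3))             # small, tight class Y (rare events)
      data = np.vstack([x_major, y_minor])
      labels = np.r_[np.zeros(len(x_major)), np.ones(len(y_minor))]

      depth_features = np.column_stack([mahalanobis_depth(data, x_major),
                                        mahalanobis_depth(data, y_minor)])
      clf = SVC(kernel="rbf", class_weight="balanced").fit(depth_features, labels)
      print("training accuracy:", clf.score(depth_features, labels))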

  • Does time-frequency-scale analysis predict inflation? Evidence from Tunisia   Order a copy of this article
    by AMMOURI BILEL, Fakhri Issaoui, Habib ZITOUNA 
    Abstract: Forecasting macroeconomic indicators has always been an issue for economic policymakers. Different models are available in the literature, for example univariate and/or multivariate models and linear and/or non-linear models. This diversity requires a multiplicity of techniques, which can be classified as pre- and post-time-series. However, this multiplicity allows better forecasts of macroeconomic indicators during unrest (be it political, economic, and/or social). In this paper, we deal with the performance of macroeconomic models for predicting Tunisia's inflation during the instability following the 2011 revolution. To achieve this goal, time-frequency-scale analysis (Fourier transform, wavelet transform, and Stockwell transform) is used. In fact, we are interested in the ability of these techniques to improve predictive performance. We find that the adopted approach (time-frequency-scale analysis) performs well, although its performance is not absolute, since it is outperformed by a multivariate model (the Dynamic Factor Model) during economic instability.
    Keywords: Inflation forecast; univariate model; multivariate model; time-frequency analysis; Fourier transform; wavelet transform; Stockwell transform.
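    A small illustration of a wavelet multi-resolution decomposition of an inflation-like series with the PyWavelets package, in the spirit of the time-frequency-scale analysis described above; the simulated series, wavelet family and decomposition level are arbitrary choices, not the paper's specification.

      import numpy as np
      import pywt

      np.random.seed(0)
      t = np.arange(256)
      inflation = 0.3 * np.sin(2 * np.pi * t / 64) + np.random.normal(0, 0.1, 256)

      coeffs = pywt.wavedec(inflation, wavelet="db4", level=3)   # [cA3, cD3, cD2, cD1]
      # Reconstruct only the low-frequency (approximation) component by zeroing the details.
      smooth = pywt.waverec([coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]], "db4")
      print(len(coeffs), smooth[:5])    # the smooth component could feed a forecasting model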