Explore our journals
Browse journals by subject
Research picks
A calmer, karma, CARMA algorithmic chameleon
A novel algorithmic system that works subtly in the background, adapting quickly to local conditions, could prove useful in data processing tasks where noise terms can be replaced with useful estimates of their values.
In the unpredictable world of data-driven modelling, some algorithms charge through problems like rhinos, while others blend in and adapt like chameleons. A new approach to a long-standing challenge in system identification, namely how to work with missing and noisy data, falls firmly into the latter camp and is discussed in the International Journal of Modelling, Identification and Control.
The method is designed for Controlled Autoregressive Moving Average (CARMA) models, mathematical structures widely used to capture and forecast the behaviour of dynamic systems in fields as diverse as control engineering, economics, and climate science.
These models work best when both input and output data are complete and reliable. In reality, such an ideal is rarely achieved, if ever. Network interruptions, faulty sensors, and environmental disruptions frequently leave gaps in the record, while background noise, often with patterns of its own, can distort what remains. Conventional algorithms might falter under such conditions, producing biased results or unstable models that bear little resemblance to the real system. The new research takes a niftier approach, quietly adjusting to the data landscape and turning potential setbacks into advantages.
Its ingenuity lies in the combination of three distinct techniques. An auxiliary model estimates the unmeasured components of the system, extracting useful signals from what would otherwise be statistical clutter, an almost karmic reversal of bad data into good. An interpolation method then fills in the missing inputs by inferring plausible values from surrounding measurements. Finally, the process is quickened using Nesterov Accelerated Gradient optimisation, a mathematically elegant way of anticipating the best next step rather than taking each one blindly.
Together, these steps form the Interpolation-based Nesterov Accelerated Gradient (INAG) algorithm, a system that not only produces more accurate parameter estimates but does so faster than comparable methods, even in the presence of "coloured noise", random fluctuations with structure and memory.
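For the curious, the "look-ahead" step that distinguishes Nesterov's method from ordinary gradient descent can be sketched in a few lines of Python. This is a minimal illustration on a toy quadratic loss, not the paper's CARMA estimation problem; the loss function, learning rate, and momentum value are all assumptions made for the example.

```python
# Minimal sketch of Nesterov Accelerated Gradient (NAG) on a toy loss.
# The loss f(theta) = (theta - 3)^2, the learning rate, and the momentum
# are illustrative assumptions, not values from the study.

def grad(theta):
    # Gradient of the toy loss f(theta) = (theta - 3)^2
    return 2.0 * (theta - 3.0)

def nag(theta0, lr=0.1, momentum=0.9, steps=100):
    theta, velocity = theta0, 0.0
    for _ in range(steps):
        # "Look ahead" to where the momentum would carry us, and
        # evaluate the gradient at that anticipated point instead of
        # at the current position.
        lookahead = theta - momentum * velocity
        velocity = momentum * velocity + lr * grad(lookahead)
        theta -= velocity
    return theta

print(round(nag(0.0), 4))  # converges towards the minimiser theta = 3
```

The anticipatory gradient evaluation is what lets the method "take the best next step rather than taking each one blindly", as described above.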
For engineers, better system identification means better control, whether in regulating industrial processes, stabilising power grids, or fine-tuning autonomous vehicles. For economists and climate scientists, it offers a way to make more reliable forecasts from incomplete or noisy data, potentially improving policy and planning.
Lu, H., Chen, J., Xu, F. and Mao, Y. (2025) 'The Nesterov accelerated gradient algorithm for CARMA models with lost input data based on interpolation method', Int. J. Modelling, Identification and Control, Vol. 46, No. 1, pp.38–46.
DOI: 10.1504/IJMIC.2025.147954
Paper late, getting the news of the world
Research in the International Journal of Information and Communication Technology discusses a new framework that could be used to define how news and information are delivered. The approach could help us overcome some of the associated problems of media saturation and information overload. The approach allows for real-time, adaptive decisions about the flow of information in an age of media convergence as traditional journalism, digital platforms, and interactive technologies become increasingly intertwined.
The new approach side-steps the static, off-the-shelf distribution methods of conventional media and instead develops a system that can dynamically adjust its output based on audience behaviour, tailoring the content presented as a user's interests change.
The approach integrates multimodal perception, the ability to interpret data from multiple sources such as text, images, and user interaction patterns, with reinforcement learning, a branch of artificial intelligence in which systems learn by trial-and-error guided by feedback. This combination allows the system to detect and respond to the gradual change in what an individual finds engaging over time, a concept known in psychology as interest drift.
The researchers explain that their system works on three interconnected levels. First, it uses temporal attention mechanisms, which monitor how an individual's focus changes over time, along with deep feature extraction to identify subtle behavioural patterns. This allows the system to anticipate shifts in audience engagement and adjust accordingly.
Second, it employs a hierarchical reinforcement learning architecture. In this design, complex decision-making is broken into layers, making it easier to balance competing objectives. Using a blend of deep learning methods and evolutionary algorithms, which mimic natural selection to find optimal solutions, the system maximises audience reach, ensures timely delivery, and minimises computational and network resource use.
The third layer introduces an adaptive regulation process using mathematical optimisation techniques. This component fine-tunes the balance between performance and resource consumption in real time, enabling the system to remain efficient even under fluctuating conditions.
The implications for journalism in areas where timeliness and accuracy are essential, such as public health, politics, and in emergencies and crises could be enormous. An adaptive, responsive delivery model should improve audience engagement and comprehension, the work suggests.
Zhang, Y., Liu, Y. and Guo, Z. (2025) 'Optimising news dissemination pathways in the media convergence era: an interactive digital media technology approach', Int. J. Information and Communication Technology, Vol. 26, No. 29, pp.110–126.
DOI: 10.1504/IJICT.2025.147882
Seeing the wood for the trees
Forests play a far more important role in global climate regulation and ecosystem stability than previously understood, according to authors of an Industry Note in the International Journal of Agriculture Innovation, Technology and Globalisation.
Almost one third of the Earth's land surface is forested, and we have recognised for many decades that forests act not only as sinks for atmospheric carbon dioxide but also as components of the planet's climate system, affecting temperatures, rainfall, and atmospheric composition through a complex mix of biological and physical processes. The Industry Note emphasises how we are increasing our understanding of these complex processes and finally acknowledging how non-linear, dynamic, and deeply interconnected they are.
The authors point out that while forests help mitigate climate change by absorbing carbon dioxide, when they are degraded or destroyed, they can accelerate warming. Beyond climate regulation, forests support biodiversity, purify water, protect soil, and provide resources and livelihoods to millions of people.
The article looks at the three main types of forest: tropical, temperate, and boreal, and considers the distinct climate and economic roles of each. Tropical forests, of the three, hold the largest share of biodiversity and sequestered carbon. However, they are the most threatened, largely due to deforestation and agricultural expansion. Temperate forests, common in mid-latitude regions, are valuable both as carbon sinks and for their timber production. Boreal forests, found in northern regions such as Canada and Russia, store immense amounts of carbon in soil and permafrost, making them important for long-term climate stability.
The Note points out that since 2000, global forest cover has declined by almost one eighth. We have lost an area about twice the size of Texas or more than the area of Germany, Italy, and Spain combined. In tropical regions, deforestation is mainly driven by land clearing for agriculture. In temperate and boreal areas, wildfires and large-scale logging are more common causes. Climate change further increases risks, making forests more vulnerable to droughts, pests, and fire. These pressures reduce the forests’ ability to absorb carbon and alter their structure and function, with consequences for biodiversity and environmental stability. The Industry Note emphasises that sustainable forest management is now a matter of urgency if we are not to see continued harms from the loss of this natural resource.
An, Z. and Chang, S.X. (2024) 'Industry note: Forests are key to climate change mitigation and sustainable development', Int. J. Agriculture Innovation, Technology and Globalisation, Vol. 4, No. 4, pp.423–428.
Rethinking risky routes for regulated materials
Research in the International Journal of Shipping and Transport Logistics has looked at the problems of transporting hazardous materials where accidental and deliberate threats might cause serious problems, particularly in terms of potential terrorist activity.
Risk models for hazardous materials (hazmat) transport have usually focused on accidents caused by mechanical failure, human error, or environmental conditions. These models have been used to decide which routes transport, whether by road, rail, or sea, should take, the aim being to minimise the risks to life and the environment associated with accidental spills, fires, or explosions. However, since the early 2000s, intentional attacks on hazmat transport have been of growing concern. Such terrorist activity has the potential for mass casualties, widespread environmental harm, and severe economic disruption.
The method discussed in IJSTL takes us a step further by acknowledging the ubiquitous threat of terrorism and incorporating worst-case scenarios into the risk assessment from the first stages of planning onwards. The strategy takes into account tactical factors such as whether vehicles travel alone or in convoys, the spacing between them, the types of roads used, how quickly vehicles can travel under attack, and the likely weapons or tactics an attacker might employ.
The researchers explain that the model can be built into a Geographic Information System (GIS), a mapping tool that allows complex layers of data, such as road conditions, population density, and known security risks, to be analysed together. This level of information integration should allow route planners to identify transport routes that strike a balance between minimising accident risk and minimising terrorism risk, rather than accounting for only one.
The team undertook a case study in one particularly troubled part of the world and demonstrated that their analysis would lead to different routing decisions. Routes that were considered optimal under traditional accident-only models were often very different when terrorism risk was taken into account. The terrorism-aware model favoured wider, well-maintained, multi-lane roads that allow higher travel speeds, reducing the time vehicles spend in potentially dangerous areas.
Yilmaz, Z. and Verter, V. (2025) 'Simultaneous consideration of the accident and terror risks for hazardous materials transportation', Int. J. Shipping and Transport Logistics, Vol. 21, No. 1, pp.1–38.
DOI: 10.1504/IJSTL.2025.147886
Overwork might mean the drugs won't work
Research in the International Journal of Human Factors and Ergonomics has examined outpatient prescribing and found that overworked pharmacists as well as systemic workplace flaws can lead to dispensing errors. According to the World Health Organization (WHO), such errors remain a major and preventable cause of patient harm worldwide.
The researchers conducted their study in a large outpatient medical centre. They combined situational analysis, in-depth interviews, and detailed surveys to build a human-centred picture of medication and prescribing safety. Instead of framing mistakes as individual failings, the study focused on the environmental, cognitive, and systemic pressures shaping how well pharmacists performed in their roles.
Pharmacists commonly reported greater workloads than other healthcare professionals, with high time pressure and prolonged periods of standing, bending, and repetitive motion being problems they raised. Indeed, half of pharmacists reported persistent musculoskeletal discomfort over the past year, most commonly in the neck, shoulders, and knees. Notably, these symptoms did not vary significantly with years of experience or specific job role. This, the researchers suggest, means that physical strain is an inherent feature of the current work environment for pharmacists rather than a temporary issue.
The study identified several key risk factors for dispensing errors: tiredness, frequent interruptions during work, and pharmaceutical names and packaging that are very similar. These factors reflect both human vulnerabilities, such as cognitive overload, and systemic weaknesses, such as inefficient workflow and poor packaging design.
The WHO has warned that medication-related harm is one of the top preventable causes of injury and death, and if counted as a disease, it would rank among the leading causes of mortality. Dispensing errors are thus part of a wider global challenge. Many such mistakes might be avoided by better working conditions for pharmacists and improved systems from the manufacturer's production and packaging plant to the pharmacy shelves.
Su, K-W., Tsai, Y-T. and Feng, Q-K. (2025) 'The multidimensional determinants of outpatient pharmacy dispensing errors: a mixed approach', Int. J. Human Factors and Ergonomics, Vol. 12, No. 1, pp.20–40.
DOI: 10.1504/IJHFE.2025.147892
Recycle, resell, reboot
Laptops and notebook computers become outdated as operating systems and software advance and the hardware needed to support their myriad features becomes more demanding. As such, there is a growing problem of electronic waste (e-waste) from this particular sector as consumers replace effectively redundant devices with new ones. Millions of devices are discarded annually, so finding effective strategies to manage their environmental and social impact has become a global priority.
Research in the International Journal of Operational Research has focused on remanufacturing, the process of disassembling used devices, restoring or replacing, and even upgrading components, and rebuilding them to near-new condition. This approach not only reduces waste and conserves raw materials but also offers wider benefits such as creating jobs and making technology more accessible to lower-income users, given that a refurbished laptop may well be a lot cheaper than the latest model, but should in many cases be perfectly capable of running those advanced operating systems and software.
One of the big challenges in remanufacturing is deciding what to do with individual laptop components at end-of-life. These components vary widely in quality, and their potential for reuse is far from straightforward. High-quality parts are typically worth the cost of repair, but lower-quality ones present more difficult economic decisions. Yet, discarding them outright leads to the loss of materials that could otherwise be salvaged.
There is a lot of uncertainty in the various processes, so the team has developed a sophisticated decision-making framework using a multi-period nonlinear integer programming model. This is a mathematical model designed to determine, over several time periods, the most cost-effective and resource-efficient use for each component, whether through reuse, conditional repair, or disposal. The approach uses advanced approximation techniques known as metaheuristics, essentially a sophisticated form of trial-and-error, to reduce the computing resources needed for the assessment. Metaheuristics can test possible solutions virtually and identify high-quality outcomes without having to examine every possibility in full detail.
Two such trial-and-error algorithms were auditioned for the job: Discrete Particle Swarm Optimisation (DPSO) and Genetic Algorithm (GA). DPSO is inspired by the behaviour of birds, fish, or insect flocks, shoals, and swarms. GA is based on natural selection and allows improved solutions to evolve over successive generations. Both were integrated into a decision support tool built in Microsoft Excel using Visual Basic. This choice should enable usability for business practitioners without expertise in complex mathematics.
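The evolutionary idea behind the GA can be sketched compactly. The toy below maximises the classic OneMax fitness (count the 1-bits in a string), standing in for the paper's remanufacturing cost model; the population size, mutation rate, and random seed are all illustrative assumptions.

```python
import random

# Toy genetic algorithm (GA) sketch. Fitness here is the OneMax problem
# (count of 1-bits), a stand-in for the real cost model; all parameters
# are illustrative assumptions.

random.seed(1)

def fitness(bits):
    return sum(bits)

def select(pop):
    # Tournament selection: keep the fitter of two random individuals.
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def evolve(n_bits=20, pop_size=30, generations=60, mutation_rate=0.02):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        children = []
        for _ in range(pop_size):
            p1, p2 = select(pop), select(pop)
            cut = random.randrange(1, n_bits)            # one-point crossover
            child = p1[:cut] + p2[cut:]
            # Flip each bit with small probability (mutation).
            child = [b ^ (random.random() < mutation_rate) for b in child]
            children.append(child)
        pop = children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))  # typically at or near the optimum of 20
```

Selection, crossover, and mutation are the "natural selection" loop described above; DPSO replaces this loop with swarm-style position updates but pursues the same goal.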
The team found that on smaller problems, the algorithms produced answers that were close to the mathematically proven optimal ones. For larger-scale cases, GA proved more reliable, while DPSO occasionally settled on less effective outcomes. An important finding from the tests was that repair costs are a major factor in profitability of remanufactured laptops. This highlights the need for systems that can respond flexibly to changing economic conditions.
Anandh, G., PrasannaVenkatesan, S., Goh, M. and Kushwaha, G.C. (2025) 'Optimising end-of-life laptop remanufacturing decisions using meta-heuristics', Int. J. Operational Research, Vol. 53, No. 4, pp.525-555.
DOI: 10.1504/IJOR.2025.147789
Every stock you take, AI be watching you
Researchers have developed a new way to model how inventory behaves when both customer demand and supplier deliveries are unpredictable, and when missed sales cannot be recovered. The approach provides more accurate estimates than common industry rules-of-thumb, and so might help businesses avoid costly overstocking and damaging shortages.
The work, discussed in the International Journal of Integrated Supply Management, builds on the Economic Order Quantity (EOQ) model. This is a well-known tool for helping stock controllers decide how much to order. The standard EOQ assumes that demand is steady and deliveries arrive on time. The new model is more realistic and recognises that demand and supply can vary from day to day.
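The standard EOQ formula that the new model builds on is short enough to state directly. The figures below are made-up illustrative inputs, not values from the study.

```python
from math import sqrt

# The classic Economic Order Quantity (EOQ) formula:
#   Q* = sqrt(2 * D * K / h)
# where D is annual demand, K the fixed cost per order, and h the
# holding cost per unit per year. Inputs below are invented examples.

def eoq(annual_demand, order_cost, holding_cost_per_unit):
    # Q* balances ordering costs (fewer, larger orders) against
    # holding costs (smaller, more frequent orders).
    return sqrt(2 * annual_demand * order_cost / holding_cost_per_unit)

print(round(eoq(annual_demand=10_000, order_cost=50, holding_cost_per_unit=2)))
# → 707 units per order
```

The new model keeps this cost-balancing spirit but drops the assumptions of steady demand and punctual delivery.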
In their model, each day's demand and supply are treated as a series of simple "yes or no" events: a unit is either sold or not, and a unit is either delivered or not. In probability terms, these are called Bernoulli trials. When combined over several days, familiar statistical patterns emerge: the binomial distribution for demand and the geometric distribution for delivery times. This approach allows the model to capture both steady daily sales and highly irregular demand.
The researchers were able to calculate an exact "steady-state" picture of how much stock a business is likely to have on hand in the long run, after short-term fluctuations even out. To do this, they used a Markov chain, a type of mathematical model in which the next step depends only on the current state, not the full history.
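A steady state of a Markov chain can be found numerically by repeatedly applying the transition probabilities until the distribution stops changing. The 3-state transition matrix below is an invented toy example, not the inventory chain from the paper.

```python
# Minimal sketch of finding a Markov chain's steady state by power
# iteration. The transition matrix is an invented toy example.

def steady_state(P, iters=500):
    n = len(P)
    dist = [1.0 / n] * n                      # start from a uniform distribution
    for _ in range(iters):
        # One step of the chain: new_j = sum_i dist_i * P[i][j]
        dist = [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]
    return dist

# Toy chain: rows are the current state, columns give the probability
# of moving to each next state (rows sum to 1).
P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.1, 0.3, 0.6]]

pi = steady_state(P)
print([round(p, 3) for p in pi])  # the long-run share of time in each state
```

Because each step depends only on the current state, iterating the transition rule washes out the starting point, exactly the "short-term fluctuations even out" behaviour described above.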
From this steady-state analysis, the model gives precise numbers for important measures: the average inventory level, how long an inventory cycle lasts, how often the business runs out of stock, and the "fill rate". Fill rate is the share of customer demand that can be met immediately from stock. One key result is a new formula for average inventory that has been shown to work better than the current estimation tools, especially when demand is patchy and deliveries are unreliable.
The researchers explain that this matters in many real-world situations where lost sales are permanent. A supermarket cannot sell yesterday's spoiled fruit, a retailer cannot ship a promotional item after the promotion ends, and a factory may lose an urgent order for spare parts if they are not in stock at the moment they are needed. In all such cases, even small errors in estimating average inventory can have significant financial and reputational costs.
Gebennini, E., Grassi, A. and Santillo, L.C. (2025) 'Discrete-time EOQ with lost sales, binomial demand, and geometric lead time: inventory level distribution and performance analysis', Int. J. Integrated Supply Management, Vol. 18, No. 5, pp.1–35.
DOI: 10.1504/IJISM.2025.148011
Rivers under pressure
Research in the International Journal of Hydrology Science and Technology has shown that conventional approaches to measuring water storage across Europe's complex river systems may significantly under-represent the scale and severity of changes linked to climate change.
The Earth's gravitational field is not uniform: it changes slightly depending on the presence of mountains, the oceans, and even levels of groundwater. Indeed, when large amounts of water move, through heavy rainfall, melting glaciers, or groundwater depletion, they change the local gravitational field by a small amount. Conventionally, these changes have been detected by a technique known as satellite gravimetry. NASA's GRACE (Gravity Recovery and Climate Experiment) and its successor GRACE-FO (Follow-On) use satellites flying in tandem to measure these tiny gravitational changes. In turn, the technique can be used to monitor changes in natural water storage in a region, such as Europe.
However, the new research has compared traditional and data-driven filtering techniques used to interpret data from GRACE-FO. The researchers found that newer, model-independent approaches offer more accurate results, particularly in capturing droughts and floods in Europe's complex system of rivers.
The main issue with the old approach is that the data from the satellites has low spatial resolution and is affected by signal interference from neighbouring regions. This latter issue, known as leakage, makes interpreting the raw data a technical challenge. To reduce noise and clear up the signals, researchers apply mathematical filters, such as Gaussian smoothing. The new study shows that this conventional method can introduce errors in regions with irregular geography, such as coastlines and densely interlaced river basins, characteristics common across the continent of Europe.
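The blurring effect of a Gaussian filter, and the way a sharp signal "leaks" into its neighbours, can be shown on a one-dimensional toy signal. The real GRACE processing works on two-dimensional spherical-harmonic fields; this sketch and its parameters are purely illustrative.

```python
from math import exp

# Sketch of 1-D Gaussian smoothing, illustrating how a sharp local
# "water storage" anomaly is blurred into neighbouring cells, the
# leakage problem described above. Parameters are illustrative.

def gaussian_kernel(radius, sigma):
    w = [exp(-(i * i) / (2 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(w)
    return [x / s for x in w]               # normalise so weights sum to 1

def smooth(signal, radius=2, sigma=1.0):
    k = gaussian_kernel(radius, sigma)
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, wj in enumerate(k):
            idx = i + j - radius
            if 0 <= idx < len(signal):       # truncate the kernel at the edges
                acc += wj * signal[idx]
        out.append(acc)
    return out

# A single sharp anomaly spreads into its neighbours after smoothing:
sig = [0, 0, 0, 10, 0, 0, 0]
smoothed = smooth(sig)
print([round(x, 2) for x in smoothed])
```

The truncated kernel at the ends of the signal is a one-dimensional analogue of the coastline and basin-boundary errors the study highlights.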
The researchers have evaluated two data-driven techniques, the "method of scale" and the "method of deviation", which use only the satellite data rather than putatively biased external hydrological models. They showed that the method of scale reduces measurement uncertainty to less than 15%.
This improvement has practical implications for understanding three European river basins, the Rhone in France, the Neman on the Belarus-Lithuania border, and the Vuoksi-Neva in Finland and Russia. The new approach provides a clearer and more precise view of how water levels respond to extreme weather events.
Europe is already experiencing more frequent and severe droughts, with disproportionate effects on its smaller river basins. These catchments are limited in water storage and often highly sensitive to climatic fluctuations, but nevertheless form the backbone of many regional water systems. More accurate monitoring is critical to policymakers attempting to respond to changes.
Lenczuk, A. (2025) 'Impact of spatial filtering to GRACE-FO-derived TWS changes: a case study of European river basins', Int. J. Hydrology Science and Technology, Vol. 20, No. 5, pp.1-30.
DOI: 10.1504/IJHST.2025.147953
The long and winding road to net-zero
Research in the International Journal of Shipping and Transport Logistics has analysed four decades of freight transportation data across the USA and shows how different modes of transport contribute to carbon dioxide emissions. The work reveals a complex picture wherein transportation is indeed a major source of greenhouse gases, alongside electricity generation, but air, rail, road, water, and pipeline make different contributions to the problem.
The researchers used an Autoregressive Distributed Lag model to consider both short-term and long-term relationships between transportation activity and emissions. This allowed them to extract those relationships even when the underlying data are uneven. Such a level of detail is uncommon in climate and transport studies, which often focus on a narrower time frame or fewer transport categories.
Surprisingly, a negative relationship between both air and pipeline transport and carbon emissions over the period 1980 to 2022 showed that even as these two forms of freight activity increased, emissions actually declined slightly: by 0.03% for every 1% rise in air transport activity, and by 0.06% for pipeline transport. Such figures seem small, but they could be statistically meaningful when scaled across the vast transport systems of the USA and over several decades. The findings suggest that adoption of cleaner technologies, especially in aviation, is having an effect. Innovation in the road and rail sectors could reap similar rewards, the research suggests.
The various interdependencies suggest that emissions, energy consumption, and transport activity are so closely intertwined that coordinated policy responses, rather than isolated reforms, are now essential if we are to achieve net-zero.
The researchers suggest that by breaking down the environmental impact of individual transport modes, it is possible to develop more targeted climate strategies. For example, expanding pipeline infrastructure or accelerating the rollout of sustainable aviation technology may deliver greater emissions reductions than blanket policies applied across all transport sectors.
Ergen, H., Aslan, A. and Ayvaz, E.E. (2025) 'Can air transportation reach to zero carbon emissions: comparative econometric analysis between transportation modes in the USA', Int. J. Shipping and Transport Logistics, Vol. 21, No. 1, pp.71–99.
DOI: 10.1504/IJSTL.2025.147885
Speak easy
Research conducted at a Chinese university and reported in the International Journal of Computational Systems Engineering has looked at how machine learning and big data techniques might be used to identify the most influential factors in English language learning for non-English majors. The researchers analysed the academic progress of 1,805 students using a refined machine learning model and found that student motivation was the single most important driver of improvement, outweighing even the teaching methods or mode of instruction.
The findings emerged from an analysis using an advanced decision tree algorithm, an enhanced CHAID (Chi-squared Automatic Interaction Detector) decision tree coupled with a genetic algorithm that filters out irrelevant data and has improved predictive accuracy.
A decision tree is a machine learning model that maps the relationships between variables in a branching format. The CHAID variant is particularly suited to education research, as it handles categorical variables well, including learning environments and teaching styles. By enhancing the CHAID algorithm with genetic programming, the researchers were able to evolve the decision model iteratively, improving its ability to identify key patterns in the student data.
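The chi-squared test at the heart of CHAID can be illustrated in a few lines: cross-tabulate each candidate predictor against the outcome and keep the one with the largest statistic. The counts below are invented, though they are constructed to be consistent with the article's sample of 1,805 students and 352 improvers; they are not the study's actual data.

```python
# Sketch of the chi-squared split test behind CHAID. For each candidate
# categorical predictor we cross-tabulate it against the outcome and
# compare chi-squared statistics. All counts are invented examples.

def chi_squared(table):
    # table: rows = predictor categories, columns = outcome classes
    row_tot = [sum(r) for r in table]
    col_tot = [sum(c) for c in zip(*table)]
    total = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_tot[i] * col_tot[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Invented counts of [improved, not-improved] per category:
motivation = [[120, 200], [180, 700], [52, 553]]   # high / medium / low
teaching   = [[140, 560], [212, 893]]              # interactive / lecture

scores = {"motivation": chi_squared(motivation), "teaching": chi_squared(teaching)}
print(max(scores, key=scores.get))  # → motivation (the stronger predictor)
```

CHAID would split the tree on the predictor with the most significant statistic; the genetic-algorithm enhancement described above then evolves which predictors and splits survive.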
The primary metric analysed was the percentage of students making notable progress. Just under 20 percent of students in the sample, 352 individuals, met this threshold. The model was then tasked with identifying what differentiated these students from the majority.
The answer, the researchers found, lay first and foremost in student motivation. Whether driven by career ambitions, academic goals, or personal interest, a student's reason for studying English had the strongest correlation with measurable improvement. Teaching methods, ranging from interactive approaches to more traditional lecture formats, ranked second in influence, followed by the mode of instruction, whether face-to-face or online learning.
English continues to serve as a bridge language in academia, commerce, and international dialogue, so effective English instruction is a priority in educational institutions around the world. The study provides a new insight into how teaching might be improved, specifically in China, but perhaps elsewhere too.
Cai, H. (2025) 'A study on the optimisation of university English teaching based on an enhanced decision tree model in the context of big data', Int. J. Computational Systems Engineering, Vol. 9, No. 12, pp.1–11.
DOI: 10.1504/IJCSYSE.2025.147788
News
Prof. Liang Zhou appointed as new Editor in Chief of International Journal of Electronic Healthcare
Prof. Liang Zhou from the Shanghai Intelligent Medical Devices and Active Health Collaborative Innovation Center in China has been appointed to take over editorship of the International Journal of Electronic Healthcare.
Prof. Shirley Mo-Ching Yeung appointed as new Editor in Chief of International Journal of Strategic Business Alliances
Prof. Shirley Mo-Ching Yeung from Gratia Christian College in Hong Kong has been appointed to take over editorship of the International Journal of Strategic Business Alliances.
Prof. Xu Zheng appointed as new Editor in Chief of International Journal of Information Technology, Communications and Convergence
Prof. Xu Zheng from Shanghai Polytechnic University in China has been appointed to take over editorship of the International Journal of Information Technology, Communications and Convergence.
Associate Prof. Marco Valeri appointed as new Editor in Chief of International Journal of Intercultural Information Management
Associate Prof. Marco Valeri from Niccolò Cusano University in Italy has been appointed to take over editorship of the International Journal of Intercultural Information Management.
Associate Prof. Jia-Ning Kang appointed as new Editor in Chief of International Journal of Renewable Energy Technology
Associate Prof. Jia-Ning Kang from the Beijing Institute of Technology in China has been appointed to take over editorship of the International Journal of Renewable Energy Technology.