2020 Research news

Virtual reality could be used as a powerful marketing tool for urban tourism. Natasha Moorhouse of the Faculty of Business and Law at Manchester Metropolitan University in the United Kingdom discusses the details in the International Journal of Technology Marketing.

"In an increasingly complex and global marketplace, it is vital that urban tourism destinations develop novel marketing strategies to differentiate, remain competitive, and ultimately attract and retain visitors to facilitate long-term tourism growth," she writes. Virtual reality has been long anticipated in tourism marketing, offering those selling destination experiences the opportunity to share the wonders of different places from the comfort of the travel agent office or even the holidaymaker's home. But, adoption has been slow despite the potential. Marketers need to understand better the possibilities of these tools as well as their limitations in order to give consumers the best opportunities.

Moorhouse's study contributes exploratory research that could offer valuable insights into the various factors associated with virtual reality tools in this context. She puts particular emphasis on the marketing of urban destinations. Virtual reality will remain a challenge for many tour operators, and public perception of such systems may well slow the uptake. However, there is also the potential for embedding virtual reality into the detailed planning of a trip, allowing travelers to investigate places in detail before they map out their itinerary.

Of course, it might be that, in an age of lowering our collective carbon footprint, virtual reality tourism could be a less costly alternative to travel, both financially and environmentally.

Moorhouse, N. (2019) 'Virtual reality as an urban tourism destination marketing tool', Int. J. Technology Marketing, Vol. 13, Nos. 3/4, pp.285–306.
DOI: 10.1504/IJTMKT.2019.104599

Data mining and extraction of knowledge from disparate sources is big data, big business. But how does search software cope with entities that are mentioned by only part of their name, or whose name is hyphenated when it normally is not? Research published in the International Journal of Intelligent Information and Database Systems reveals details of a new approach to improving named entity recognition and disambiguation in news headlines.

Jayendra Barua and Rajdeep Niyogi of the Department of Computer Science and Engineering, at the Indian Institute of Technology, in Roorkee, Uttarakhand, India, explain that their approach to such an analysis of current news headlines builds on a trained algorithm that has been taught to strip hyphens and complete partial names, thereby resolving ambiguity.
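The trained model itself is detailed in the paper; purely as an illustrative sketch, the kind of headline normalisation step the authors describe might look like the following Python. The gazetteer of known entities and the example headline are hypothetical stand-ins, not the authors' data or method.

```python
import re

# Hypothetical gazetteer mapping surface forms to canonical entities;
# in the paper this resolution is learned, not hand-coded.
KNOWN_ENTITIES = {
    "obama": "Barack Obama",
    "microsoft": "Microsoft Corporation",
}

def normalise_headline(headline: str) -> str:
    """De-hyphenate tokens and expand partial names to canonical forms."""
    # Collapse intrusive hyphens of the kind headline styling introduces,
    # e.g. "Oba-ma" -> "Obama" (a trained model would decide which
    # hyphens are legitimate; this crude regex collapses them all).
    dehyphenated = re.sub(r"(\w+)-(\w+)", r"\1\2", headline)
    resolved = []
    for token in dehyphenated.split():
        key = token.strip(".,;:!?").lower()
        # Replace a partial mention with its canonical entity name.
        resolved.append(KNOWN_ENTITIES.get(key, token))
    return " ".join(resolved)

print(normalise_headline("Oba-ma to visit Microsoft HQ"))
# -> Barack Obama to visit Microsoft Corporation HQ
```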

The team's evaluation of their novel approach shows that it works with approximately 10 percent greater accuracy than conventional systems and so could improve the automated retrieval of news associated with particular companies, organizations, events, public figures, and other entities of interest to those data mining the news. The system works well with newsfeeds, such as the RSS type of newsfeed generated by regularly updated websites. Headlines from such sources might commonly be longer than conventional newspaper headlines but are nevertheless succinct, commonly being ten or fewer words long. Each word might then be important in a data mining context and so disambiguation is critical.

Barua, J. and Niyogi, R. (2019) 'Improving named entity recognition and disambiguation in news headlines', Int. J. Intelligent Information and Database Systems, Vol. 12, No. 4, pp.279–303.
DOI: 10.1504/IJIIDS.2019.104530

A new study by researchers in the USA suggests that the use of social media can sometimes have a negative impact on a work project and sometimes correlate positively with success. Writing in the International Journal of Information Technology and Management, the team suggests that using one of the best-known social media systems, Facebook, can have a negative effect on project success, whereas LinkedIn has a positive effect.

Joseph Vithayathil of the School of Business at Southern Illinois University Edwardsville, Majid Dadgar of the School of Management at the University of San Francisco, and Kalu Osiri of the College of Business at the University of Nebraska-Lincoln, based their conclusions on an empirical study that analysed the relationship between the use of social media at work and project success at work.

It is well known to employers and employees that workers somehow find time during working hours to use Facebook, LinkedIn, personal Google Mail, YouTube, and many other apps and services unrelated to their work. There are numerous examples of employees being fired for using online services during the working day for personal reasons, such as online shopping, sharing photos and updates, and simply chatting to friends. The rationale is that the use of such services will inevitably have a detrimental effect on work and project success, implications for individual morale aside. Social media use continues unabated regardless of employer perception.

However, the US team has shown that, for educated employees, the use of LinkedIn, which is often considered a more business- and work-related social media platform, correlates positively with project success at work. It may well be that this particular social media service is considered less flippant than others and is used for creating and building contacts at the professional level as well as for gaining information pertinent to one's employment.

Vithayathil, J., Dadgar, M. and Osiri, J.K. (2020) 'Does social media use at work lower productivity?', Int. J. Information Technology and Management, Vol. 19, No. 1, pp.47–67.
DOI: 10.1504/IJITM.2020.104504

Millions living on the Indian subcontinent aspire to own the technological breakthroughs, such as smartphones and tablet computers, that are now almost ubiquitous in other countries. The question of sustainability arises, as does the notion of so-called "green" economics, when considering the huge numbers involved.

A new report in the International Journal of Green Economics discusses one aspect of technology that might allow such issues to be addressed to some extent: namely, the idea that a large proportion of the population with disposable income is keen to own and use such technology but is also quite well aware of the consequences in terms of material resources, waste and pollution, and climate change. Might those born in the two to three decades from the mid-1960s onwards, the so-called "Generation X", and their successors, the "Millennials", be more inclined to take a refurbished mobile phone rather than a brand-new gadget in the name of "saving the planet"?

Prathamesh Mhatre, formerly of Ramaiah University of Applied Sciences in Bangalore, and Hosur Srinivasan Srivatsa of the M.S. Ramaiah University of Applied Sciences, in Karnataka, India, point out that, in the face of consumer pressure born of environmental concern, many companies have been forced to implement refurbishment, recycling, and reuse strategies. This not only gives them a new market but will hopefully deliver the environmental benefits that consumers are hoping to see.

The team surveyed people born after the so-called "Baby Boom" of 1946 to 1964, that is, during the approximate periods 1964 to 1980 and then onwards to about 1997, representing "Generation X" and the "Millennial" generation, respectively. They looked at the purchase intentions of people in those two groups living in metropolitan cities of India and analysed their data using structural equation modelling.
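The full measurement and structural model is given in the paper; purely as an illustrative sketch, a simplified path model of this general shape could be specified with the open-source Python package semopy. The construct names and the synthetic data below are hypothetical stand-ins, not the authors' survey data.

```python
import numpy as np
import pandas as pd
import semopy

# Synthetic stand-in data: survey-style scores for hypothetical constructs.
rng = np.random.default_rng(0)
n = 300
attitude = rng.normal(size=n)
perceived_risk = rng.normal(size=n)
perceived_benefit = rng.normal(size=n)
purchase_intention = (0.5 * attitude - 0.3 * perceived_risk
                      + 0.4 * perceived_benefit
                      + rng.normal(scale=0.5, size=n))
data = pd.DataFrame({
    "attitude": attitude,
    "perceived_risk": perceived_risk,
    "perceived_benefit": perceived_benefit,
    "purchase_intention": purchase_intention,
})

# A simple structural model regressing purchase intention on the three
# antecedents reported as significant for Generation X consumers.
desc = "purchase_intention ~ attitude + perceived_risk + perceived_benefit"
model = semopy.Model(desc)
model.fit(data)
print(model.inspect())  # path estimates, standard errors, p-values
```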

"Attitude towards refurbishment, perceived risk and perceived benefit have a significant impact on the purchase intention of Generation X consumers," the team found. Gen X consumers seek direct benefits from purchasing refurbished phones, in other words. "By contrast, the results for Millennials show that product knowledge, perceived risk, attitude towards refurbishment and subjective norm significantly impact their purchase intention, the team reports. The results contradict earlier studies that suggested that behavioural control does not affect purchase intention and suggests that theoretical models do not always assess different demographics correctly.

Mhatre, P. and Srivatsa, H.S. (2019) 'Modelling the purchase intention of millennial and Generation X consumers, towards refurbished mobile phones in India', Int. J. Green Economics, Vol. 13, Nos. 3/4, pp.257–275.
DOI: 10.1504/IJGE.2019.104512

A new approach to encryption could improve user perception of cloud computing services where users are concerned about private or personal data being exposed to third parties. Writing in the International Journal of Cloud Computing, V. Swathi and M.P. Vani outline a proposed homomorphic encryption system.

Homomorphic encryption was developed more than a decade ago and represented something of a significant breakthrough in security. By definition, it allows computations to be carried out on a ciphertext (the user's data in the cloud service, for instance), generating a result that is still encrypted but that, when decrypted by the user, matches exactly the result that would be obtained if the same computational operations had been carried out on the user's plaintext rather than the uploaded ciphertext. It is thus very useful for ensuring the privacy of data uploaded to cloud and other outsourced computer services.
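As a concrete illustration of the homomorphic property (not of the authors' own construction), the Paillier scheme, an additively homomorphic system that predates fully homomorphic encryption, allows addition on ciphertexts. A minimal sketch using the open-source Python phe library:

```python
# pip install phe
from phe import paillier

# The user generates a keypair and keeps the private key locally.
public_key, private_key = paillier.generate_paillier_keypair()

# Sensitive values are encrypted before upload to the cloud service.
enc_a = public_key.encrypt(15)
enc_b = public_key.encrypt(27)

# The cloud can compute on ciphertexts without ever seeing the data:
# Paillier supports adding ciphertexts and multiplying by plaintext
# constants, all while the values remain encrypted.
enc_sum = enc_a + enc_b
enc_scaled = enc_a * 3

# Only the user, holding the private key, can decrypt the results.
assert private_key.decrypt(enc_sum) == 42
assert private_key.decrypt(enc_scaled) == 45
```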

Despite all the benefits of cloud computing, the very nature of these services, wherein a user must by necessity share data with a third party, the cloud service provider, means that there are endless issues of trust. Indeed, many users have not adopted cloud services because they recognise that those services being in a different domain to their own personal or private systems offers malicious third parties an opportunity to access their data in a way that would not be possible if that data were held only on the user's own domain. The use of sophisticated tools such as homomorphic encryption adds a layer of reassurance that should open up cloud services to all but the most neurotic of users, at least within limits.

Swathi, V. and Vani, M.P. (2019) 'Secure cloud computing using homomorphic construction', Int. J. Cloud Computing, Vol. 8, No. 4, pp.354–370.
DOI: 10.1504/IJCC.2019.104498

Scientists working in medical research, biology, cellular studies, and in understanding bacteria and other pathogens often need to know about temperature rises and falls in the systems on which they focus. Many processes involve heat production and tracking those changes can get to the core of understanding a process, diagnosing a disease or perhaps investigating whether a pharmaceutical, such as an antibiotic, will work.

Now, Joohyun Lee and Il Doh of the Korea Research Institute of Standards and Science, in Daejeon, South Korea, have developed a tiny device that measures otherwise undetectable heat changes. They describe their "chip calorimeter" in the International Journal of Nanotechnology. The device is based on a thermopile made from bismuth and aluminium and can detect sub-microwatt changes in energy, and thus the heat generated by very small-scale systems such as cell samples or bacterial cultures.

The chip calorimeter measures 8 by 10 millimetres and comprises four identical measurement units. A platinum electrode in the centre generates heat, with two thermopiles on either side of this heater, and maintains the device at a known temperature to within 20 millikelvin; this is technically the furnace, and it acts as a baseline for the system so that any heat increase from a sample can be detected. The whole device is supported on a membrane of silicon nitride just 1 micrometre thick. "Any heat generation by sample or heater in the area of the inner thermopile connection induces temperature difference between the outer and the inner connections so that it produces voltage signal measurable with a nanovoltmeter," the team explains.
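In other words, once the thermopile's responsivity is known from calibration against the on-chip heater, a nanovolt reading converts directly into heat power. A back-of-the-envelope sketch, with an assumed sensitivity value that is a hypothetical placeholder, not a figure from the paper:

```python
# Converting a thermopile voltage reading into heat power.
# The sensitivity below is an assumed placeholder; real devices are
# calibrated against the on-chip platinum heater.
SENSITIVITY_V_PER_W = 5.0  # hypothetical thermopile responsivity, volts per watt

def heat_power_watts(voltage_v: float) -> float:
    """Infer sample heat output from the measured thermopile voltage."""
    return voltage_v / SENSITIVITY_V_PER_W

# A nanovoltmeter reading of 2 microvolts would then correspond to:
reading = 2e-6  # volts
print(f"{heat_power_watts(reading) * 1e6:.2f} microwatts")  # 0.40 microwatts
```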

The chip calorimeter could ultimately be employed in measuring metabolic heat of cells for antibiotic research, changes in environmental samples, and temperature changes associated with disease for diagnosis, the team writes.

Lee, J. and Doh, I. (2019) 'Development of chip calorimeter based on Bi/Al thermopile for biological sample measurement', Int. J. Nanotechnology, Vol. 16, Nos. 4/5, pp.281–288.
DOI: 10.1504/IJNT.2019.104473

Predicting the damage caused by a hurricane might be possible thanks to an analysis of semantic web resources, according to work published in the International Journal of Computational Science and Engineering.

Quang-Khai Tran and Sa-kwang Song of the Department of Big Data Science at the University of Science and Technology in South Korea explain that they have created an algorithm trained with reported damage from 48 sites in the USA hit by five different hurricanes. The algorithm can then predict the damage likely to be seen six hours after the landfall of other hurricanes, based on those statistics. It works well even with sparse and incomplete data sets, the team reports, which could be important in the face of climate change and highly variable weather reporting.

"[The system] was able to estimate the damage levels in several scenarios even if two-thirds of the relevant weather information was unavailable," the team writes. Of course, additional information and training can only improve the system.

The team explains that, with some additional development of the kind outlined in the paper, the statistical components of the current version of the algorithm should ultimately be able to cope with real-time streaming data. The system might then be able to predict damage should we once more see hurricanes of the scale and devastation of Katrina in the USA in 2005, cyclone Nargis in Myanmar in 2008, or super typhoon Haiyan in the Philippines in 2013.

Tran, Q-K. and Song, S-k. (2019) 'Learning pattern of hurricane damage levels using semantic web resources', Int. J. Computational Science and Engineering, Vol. 20, No. 4, pp.492–500.
DOI: 10.1504/IJCSE.2019.104435

Researchers in China have investigated what we mean by "information overload" in the context of a social media application, WeChat. Their findings have implications for those who use and run such services as well as other researchers in the field and psychosocial practitioners.

Writing in the International Journal of Mobile Communications, the team reports that the amount of information received and the length of content correlate with user perceptions of information overload, as one might expect. However, the number of subscriptions a user has within the service was not a significant factor in this perception. The perception of information overload was, though, associated with negative emotions and an increased, persistent intention to discontinue usage. Negative emotions and this urge to disconnect from the service were stronger among users with more experience of it.

Information overload has been defined as the point at which users of any given service receive so much information in a short space of time that they no longer have the capacity to process it all satisfactorily, leading to stress or anxiety and diminished decision-making ability.

"Living in a [so-called] digital society, we are bombarded with information whether or not we actively seek it," the team writes. "We are all affected by the increasing number of sources from which information emanates." They add that "Recognising the antecedents and consequences of information overload can help us to prevent it or at least deal with it."

Zhang, X., Ma, L., Zhang, G. and Wang, G-S. (2020) 'An integrated model of the antecedents and consequences of perceived information overload using WeChat as an example', Int. J. Mobile Communications, Vol. 18, No. 1, pp.19–40.
DOI: 10.1504/IJMC.2020.104419

From a philosophical point of view, we cannot reconcile a world in which so many people suffer from malnutrition and starve for want of a few grains while others kill themselves through obesity.

Now, L. Manning of Harper Adams University, in Newport, Shropshire, UK, and J. Kelly of the Aston Business School at Aston University, in Birmingham, discuss how we might locate the social responsibility for obesity in the context of evolving norms. Writing in the International Journal of Innovation and Sustainable Development, the team suggests that most countries have experienced a significant increase in the incidence of obesity in their general populations over the last two decades. "Indeed, the condition is now so common, commentators conclude that obesity has become normalised and no longer attracts social opprobrium," the team writes.

Obesity comes with many morbidities and an increased risk of premature death due to a greater incidence of many serious health conditions. Governments and regulators have looked at how individuals might take responsibility for their own health but have also applied pressure to food and drink manufacturers to take some of the responsibility for providing citizens with healthier choices. But are individual and social responsibility the appropriate response to what is a growing crisis, especially as being overweight or obese is increasingly seen as normal despite the health effects?

The notions of gluttony and sloth are often raised in discussions of obesity, but these are at odds with a more enlightened view of the problem, one that considers the vulnerability arising from a range of social and economic factors that influence an individual's ability to make informed choices about what they eat and drink, how much they exercise, and their tendency to gain weight to a problematic degree.

Manning, L. and Kelly, J. (2020) 'Obesity: locating social responsibility in the context of evolving norms', Int. J. Innovation and Sustainable Development, Vol. 14, No. 1, pp.8–29.
DOI: 10.1504/IJISD.2020.104241

Indonesian patchouli oil represents a significant share of the world market, supplying some 90 percent of the perfume industry's demand for this common fixative agent for scents. Some 1400 tonnes are produced annually. New markets for this product may open up in medicine, given the efficacy of the substance in certain contexts in cancer chemotherapy. As such, there is an increasing need to look at its distillation from aqueous mixtures to make improved products.

Chemical engineers Chandrawati Cahyani and Wa Ode Cakra Nirwana of Brawijaya University, East Java, Indonesia have investigated how well turbidity might be used as an indicator of how far the distillation process has gone. This approach could offer a less technically onerous and so less costly test than standard gas chromatographic techniques. The team has now demonstrated that there is a linear relationship between turbidity and oil content in the aqueous emulsions of patchouli oil during distillation.
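A linear relationship of this kind is straightforward to calibrate and then invert in the field. A minimal Python sketch, with made-up calibration points standing in for the team's measurements:

```python
from scipy.stats import linregress

# Made-up calibration data: turbidity readings (NTU) versus measured
# oil content (percent) in the aqueous patchouli oil emulsion.
turbidity = [12, 35, 61, 88, 110]
oil_content = [0.5, 1.4, 2.5, 3.6, 4.5]

# Fit the linear calibration curve.
fit = linregress(turbidity, oil_content)
print(f"slope={fit.slope:.4f}, intercept={fit.intercept:.4f}, r={fit.rvalue:.3f}")

# With the calibration in hand, a single turbidity reading estimates
# oil content without resorting to gas chromatography.
reading = 75
print(f"estimated oil content: {fit.slope * reading + fit.intercept:.2f} %")
```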

The study also demonstrated that a distillation temperature of 60 degrees Celsius is optimal, minimising the additional cost of cooling the distillate with chilled water. The process was also shown to work better at pH 4 and with the addition of sodium chloride (common salt) at a concentration of 0.2 percent.

"Turbidity data proved to be an excellent indicator of separation efficiency, meaning that for field operation in a rural area it will be a beneficial tool," the team reports in the International Journal of Postharvest Technology and Innovation.

Cahyani, C. and Nirwana, W.O.C. (2019) 'The use of turbidity as a separation indicator of patchouli oil from its aqueous mixture in community distillation practices', Int. J. Postharvest Technology and Innovation, Vol. 6, No. 1, pp.1–10.
DOI: 10.1504/IJPTI.2019.104174

An aqueous extract from the root of Catharanthus roseus, a plant commonly known as bright eyes, can be used as both a reducing agent as well as a capping agent for the synthesis of bactericidal silver nanoparticles, according to research published in the International Journal of Nanoparticles. Researchers from India and The Netherlands reveal details in the latest issue of the journal.

C. roseus goes by several names, the quite whimsical "bright eyes" and the more floral Cape periwinkle, graveyard plant, Madagascar periwinkle, old maid, pink periwinkle, rose periwinkle, and others. It is a member of the dogbane family, or Apocynaceae. The plants in this family can be poisonous to dogs, hence the common name.

A root extract of C. roseus specifically contains a range of bitter, nitrogen-containing alkaloids, flavonoids, carbohydrates, amino acids, and various phenolic compounds. V. Subha of the National Center for Nano Science and Nano Technology at the University of Madras, in Chennai, Tamil Nadu, India, and colleagues have exploited this rich chemistry to carry out a biotransformation of silver nitrate solution to generate silver nanoparticles.

The team used UV-visible spectroscopy to investigate the products and found that the surface plasmon resonance of the nanoparticles gives a shallow peak at 490 nanometres, consistent with the formation of silver nanoparticles. X-ray diffraction analysis confirmed their crystalline nature, while transmission electron microscopy showed them to be monodisperse with a size of about 100 nanometres.

Such biotransformations to generate nanoparticles preclude the need for sophisticated technological solutions and separation techniques. The approach is not only more cost-efficient but avoids many of the hazardous steps of conventional synthesis involving toxic solvents and other reagents. Critically, the team's tests of the efficacy of these biotransformed silver nanoparticles showed them to be more potent against the likes of Escherichia coli, Pseudomonas aeruginosa, and Bacillus subtilis than silver nanoparticles made by more conventional means.

The team suggests that silver nanoparticles manufactured in this way might have utility in human healthcare against bacterial pathogens. They might equally be used in some form as alternatives to bactericidal sprays for food crops and other financially important plants.

Subha, V., Ravindran, E., Kumar, A.B.H. and Renganathan, S. (2019) 'Bactericidal effect of silver nanoparticles from aqueous root extracts of Catharanthus roseus', Int. J. Nanoparticles, Vol. 11, No. 4, pp.294–304.
DOI: 10.1504/IJNP.2019.104260

Cultural heritage can be destroyed. It can decay. Once it is gone, it is gone forever, sadly. Writing in the International Journal of Global Warming, Portuguese researchers discuss the potential impact of climate change on cultural heritage and how we might lose artifacts as extreme weather has a worsening impact on our world.

Guilherme Coelho, Hugo Entradas Silva, and Fernando Henriques of the Universidade NOVA de Lisboa explain that museum pieces are subject to deterioration depending on the conditions in which they are stored, whether they are being exhibited or archived. The indoor climate is obviously more controllable than the outdoor, but the increasing costs of air-conditioning, (de)humidification, and temperature control are all likely to have a detrimental effect on how conservators look after their charges. In addition, sometimes the buildings themselves are the cultural heritage.

The team has now modeled various climate change scenarios to see how weather conditions might affect a building such as Lisbon's historic church of Saint Christopher. They modeled conditions in Lisbon, but also applied likely conditions associated with Seville (Mediterranean climate), Prague and Oslo (Continental climate), as well as London (Oceanic climate). They consider not only the integrity of the artifacts within but also visitor comfort. After all, what is the purpose of conserving cultural heritage without allowing people to appreciate it? Ultimately, climate change is unlikely to be kind to artifacts housed in buildings that are themselves cultural artifacts.

Coelho, G.B.A., Silva, H.E. and Henriques, F.M.A. (2019) 'Impact of climate change on cultural heritage: a simulation study to assess the risks for conservation and thermal comfort', Int. J. Global Warming, Vol. 19, No. 4, pp.382–406.
DOI: 10.1504/IJGW.2019.104268

A collaboration between scientists in India, Portugal, and the UK has used social network analysis (SNA) to tackle the problem of industrial plant layout design. The approach allows the location and connectivity of personnel, jobs, and resources to be optimised to make the plant as efficient as possible. The team uses the maximum completion time of a job (the makespan), resource utilisation, and throughput time to evaluate system performance in this context. Overall, the approach offers a new way to move forward with plant design in the context of "Industry 4.0".

Industry 4.0 is a phrase used to refer to a subset of the fourth industrial revolution. It encompasses areas that are not normally classified as industry, such as smart cities, but is more commonly used to describe industrial plants or factories that use machines and robots connected wirelessly to controllers and sensors, ultimately networked so that the personnel hierarchy can view processes and production at different levels and make decisions based on their purview.

M.L.R. Varela of the University of Minho, in Guimarães, Portugal, Vijay Kumar Manupati of NIT Warangal, in Telangana, Suraj Panigrahi of VIT University, in Vellore, Tamil Nadu, India, and Eric Costa of Solent University in Southampton, UK (also of INESC Technology and Science, in Porto, Portugal), discuss the details in the International Journal of Industrial and Systems Engineering.

"The experimental results revealed that the proposed SNA approach supports to find the key machines of the systems that ultimately lead to the effective performance of the whole system," the team writes.

Varela, M.L.R., Manupati, V.K., Panigrahi, S., Costa, E. and Putnik, G.D. (2020) 'Using social network analysis for industrial plant layout analysis in the context of industry 4.0', Int. J. Industrial and Systems Engineering, Vol. 34, No. 1, pp.1–19.
DOI: 10.1504/IJISE.2020.104313