Zero-Trust Blockchain Privacy Solution with Homomorphic Encryption
Book chapter, Zero-Trust Learning: Applications in Modern Network Security, 2025
To tackle the privacy protection and efficiency challenges within the blockchain domain, this chapter introduces a privacy-enhancing solution for copyright blockchains, integrating lightweight homomorphic encryption and zero-knowledge proofs. The approach enhances homomorphic encryption algorithms to streamline key generation and encryption while incorporating zero-trust security principles to curtail unnecessary homomorphic operations. After lightweight homomorphic encryption is applied, sensitive data is transformed into ciphertext and securely added to the blockchain ledger by nodes authorized for accounting purposes. This solution not only rectifies the inherent drawback of complete data transparency in blockchain networks but also improves operational efficiency. Security analysis confirms its resistance to tampering and its preservation of data privacy. Through both performance simulations and theoretical deductions, the chapter demonstrates that this approach mitigates efficiency challenges related to the distribution, sharing, and computation of private data in ciphertext form. Ultimately, the proposed methodology proves more effective in upholding customer privacy than traditional digital copyright models.
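The chapter's lightweight scheme and its zero-knowledge proofs are not given in the abstract, so the sketch below only illustrates the general pattern it describes: an authorized accounting node encrypts sensitive values with an additively homomorphic scheme and appends only ciphertext to the ledger. A textbook Paillier cryptosystem with toy parameters stands in for the lightweight algorithm; the `ledger` list and the royalty figures are hypothetical.

```python
# Sketch: additively homomorphic encryption of ledger entries (Paillier stand-in).
# The chapter's lightweight scheme and zero-knowledge proofs are NOT reproduced;
# this only illustrates "encrypt, then append ciphertext to the ledger".
import math, random

def paillier_keygen(p=1009, q=1013):        # toy primes, nowhere near secure sizes
    n = p * q
    n2 = n * n
    lam = math.lcm(p - 1, q - 1)
    g = n + 1
    # mu = (L(g^lam mod n^2))^(-1) mod n, with L(x) = (x - 1) // n
    mu = pow((pow(g, lam, n2) - 1) // n, -1, n)
    return (n, g), (lam, mu, n)

def encrypt(pub, m):
    n, g = pub
    n2 = n * n
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:              # r must be coprime with n
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    lam, mu, n = priv
    n2 = n * n
    return ((pow(c, lam, n2) - 1) // n) * mu % n

def add_ciphertexts(pub, c1, c2):
    n, _ = pub
    return (c1 * c2) % (n * n)              # product of ciphertexts = sum of plaintexts

# Hypothetical usage by an accounting-authorized node:
pub, priv = paillier_keygen()
ledger = []                                 # stand-in for the copyright blockchain ledger
royalty_a, royalty_b = 120, 85
ledger.append(encrypt(pub, royalty_a))      # only ciphertext ever reaches the chain
ledger.append(encrypt(pub, royalty_b))
total_ct = add_ciphertexts(pub, ledger[0], ledger[1])
assert decrypt(priv, total_ct) == royalty_a + royalty_b
```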
A Detailed Investigation on Potential Impact of Quantum Computing on Improving Artificial Intelligence
Nagaraj G., Upadhayaya N., Matroud A., Sabitha N., Reshma V.K., Nagaraju R.
Conference paper, International Conference on Innovative Data Communication Technologies and Application, ICIDCA 2023 - Proceedings, 2023, DOI Link
Quantum computing (QC) is an emerging technology, and several groups and research institutions are working to make its applications a reality. Artificial Intelligence (AI) is another emerging field that is becoming more established over time. The main goal of this research is to determine how the advancement of quantum computing technology will affect AI applications. Computational methods are used to explore the growing influence of quantum computing development on a specific AI application. The study also discusses the impact and potential of QC on AI. These gains can be realized by employing cutting-edge methods to deliver usable, economically viable, and scalable technology across a wide range of industries. On the other hand, the susceptibility of quantum algorithms to errors, as well as their inability to function without super-cooling, emerges as a major challenge. Macro-energy systems and the development of sustainable energy materials are just two areas of science and engineering that quantum computing has the potential to transform completely.
A Framework for Collaborative Computing on Top of Mobile Cloud Computing to Exploit Idle Resources
Article, Annals of Data Science, 2023, DOI Link
In the contemporary era, collaborative computing is the widely used model for exploiting geographically distributed heterogeneous computing resources. Mobile Cloud Computing (MCC) offers an infrastructure for offloading storage and computation to a public cloud, which brings several advantages. However, in the context of modern Internet of Things based applications, it is also essential to exploit the idle resources of mobile devices, which is a challenging problem because mobile devices are resource-constrained and mobile. Many existing MCC solutions concentrate on offloading tasks outside the mobile device. In this paper, we investigate the possibility of using idle resources on mobile devices in addition to offloading tasks to the cloud. We propose a novel algorithm, Delay-aware Energy-Efficient Task Scheduling, which analyses locally available idle resources and schedules tasks over the heterogeneous cores of mobile devices as well as the cloud. In the process, it meets the strict deadlines associated with tasks and promotes energy conservation. A prototype application was built to simulate and evaluate the proposed algorithm, and the experimental results reveal that it outperforms the existing baseline algorithms.
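The abstract does not detail the Delay-aware Energy-Efficient Task Scheduling algorithm, so the sketch below is only a minimal illustration of the underlying idea: for each task, pick the execution target, a local heterogeneous core or the cloud, that uses the least energy among those meeting the deadline. The `Target` and `Task` classes, the cost model, and all numbers are assumptions, not the paper's method.

```python
# Illustrative-only sketch of delay-aware, energy-efficient task placement:
# for each task, pick the cheapest-energy target whose completion time meets the deadline.
# The cost model and the greedy rule are assumptions; the paper's algorithm is not reproduced.
from dataclasses import dataclass

@dataclass
class Target:                 # a local core or the remote cloud
    name: str
    speed: float              # work units per second
    power_w: float            # active power drawn on the device (0 for cloud execution)
    offload_delay_s: float    # network round-trip cost, 0 for local cores
    offload_energy_j: float   # radio energy to ship the task, 0 for local cores

@dataclass
class Task:
    name: str
    work: float               # work units
    deadline_s: float

def schedule(tasks, targets):
    plan = []
    for t in tasks:
        feasible = []
        for tg in targets:
            finish = t.work / tg.speed + tg.offload_delay_s
            energy = t.work / tg.speed * tg.power_w + tg.offload_energy_j
            if finish <= t.deadline_s:
                feasible.append((energy, finish, tg.name))
        if not feasible:
            raise ValueError(f"{t.name}: no target meets the deadline")
        energy, finish, name = min(feasible)   # least energy among deadline-safe options
        plan.append((t.name, name, round(finish, 3), round(energy, 3)))
    return plan

targets = [
    Target("big core",    speed=4.0,  power_w=2.0, offload_delay_s=0.0, offload_energy_j=0.0),
    Target("little core", speed=1.5,  power_w=0.6, offload_delay_s=0.0, offload_energy_j=0.0),
    Target("cloud VM",    speed=12.0, power_w=0.0, offload_delay_s=0.4, offload_energy_j=0.8),
]
tasks = [Task("sensor fusion", work=3.0, deadline_s=1.0),
         Task("image resize",  work=6.0, deadline_s=3.0)]
print(schedule(tasks, targets))
```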
Novel hybrid artificial intelligence-based algorithm to determine the effects of air pollution on human electroencephalogram signals
Berlin M.A., Upadhayaya N., Algahtani A., Tirth V., Islam S., Murali K., Kshirsagar P.R., Hung B.T., Chakrabarti P., Dadheech P.
Article, Journal of Environmental Protection and Ecology, 2021
Air pollution is a serious global issue. The complicated mix of gases, fluids, and particulate matter, together with heterogeneous and growing pollutants from cars, manufacturers, and houses, damages the ionosphere as well as human health. Preliminary research has consistently found elevated risk of coronary heart disease from both short- and long-term exposure to existing levels of ambient particulate matter. These challenges can be modelled with Artificial Intelligence (AI), whose benefit is that it can resolve the problem even with partial data and without an explicit conceptual link between the input and output data. The AI approach may be used to accurately identify coronary disease and numerous other conditions in order to prevent heart problems. This paper provides a comprehensive analysis of the data on air pollution and cardiovascular disease for healthcare experts and regulatory authorities, and also readily differentiates individuals with cardiovascular disease from healthy individuals. A decision support system based on AI technology can help physicians diagnose heart patients more accurately. With features chosen by the correlation-based feature subset selection algorithm, the logistic regression classifier exhibited the greatest accuracy of 92% under 20-fold cross-validation.
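A minimal sketch of the evaluation pipeline the abstract reports, assuming scikit-learn: correlation-style feature selection followed by logistic regression under 20-fold cross-validation. `SelectKBest` with an ANOVA F-score stands in for correlation-based feature subset selection (which scikit-learn does not ship), and `make_classification` stands in for the paper's dataset, so the resulting accuracy is illustrative rather than the reported 92%.

```python
# Sketch of the reported pipeline: correlation-style feature selection,
# then logistic regression evaluated with 20-fold cross-validation.
# SelectKBest(f_classif) stands in for CFS; make_classification stands in for
# the paper's air-pollution / health data.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=400, n_features=30, n_informative=8,
                           random_state=0)            # placeholder dataset

pipeline = make_pipeline(
    StandardScaler(),
    SelectKBest(score_func=f_classif, k=8),           # stand-in for CFS subset selection
    LogisticRegression(max_iter=1000),
)

scores = cross_val_score(pipeline, X, y, cv=20)        # 20-fold cross-validation
print(f"mean accuracy: {scores.mean():.3f}")
```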
A Lion-Whale optimization-based migration of virtual machines for data centers in cloud computing
Venkata Krishna J., Apparao Naidu G., Upadhayaya N.
Article, International Journal of Communication Systems, 2018, DOI Link
Scalability, reliability, and flexibility in cloud computing services are obligatory to meet the growing demand for computation power. To sustain scalability, a proper virtual machine migration (VMM) approach is needed, with an apt balance between quality of service and service-level agreement violation. In this paper, a novel VMM algorithm based on Lion-Whale optimization is developed by integrating the Lion optimization algorithm and the Whale optimization algorithm. Optimal virtual machine (VM) migration is performed by the Lion-Whale VMM based on a new fitness function that regulates the resource use, migration cost, and energy consumption of VM placement. The proposed VM migration strategy is evaluated on four cloud setups with different configurations, simulated using the CloudSim toolkit. Its performance is validated against existing optimization-based VMM algorithms, such as particle swarm optimization and the genetic algorithm, using measures such as energy consumption, migration cost, and resource use. Simulation results reveal that the proposed Lion-Whale VMM effectively outperforms the other approaches in optimal VM placement for the cloud computing environment, with a reduced migration cost of 0.01, maximal resource use of 0.36, and minimal energy consumption of 0.09.
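The abstract states only that the fitness function balances resource use, migration cost, and energy consumption. The placeholder below scores a candidate VM-to-host placement with an assumed weighted sum and explores it with plain random search, since the hybrid Lion-Whale optimizer itself is not described in the abstract; all weights, loads, and the power model are hypothetical.

```python
# Illustrative placeholder for VM-migration fitness evaluation:
# score a candidate VM-to-host placement on resource use, migration cost, and energy.
# The weighted sum and random search are assumptions; the Lion-Whale optimizer is not reproduced.
import random

HOSTS = 4
VM_LOAD = [0.30, 0.25, 0.20, 0.15, 0.35, 0.10]        # hypothetical CPU demands per VM
CURRENT = [0, 0, 1, 1, 2, 3]                          # current placement (vm -> host)

def fitness(placement, w_util=0.4, w_migr=0.3, w_energy=0.3):
    util = [0.0] * HOSTS
    for vm, host in enumerate(placement):
        util[host] += VM_LOAD[vm]
    overload = sum(max(0.0, u - 1.0) for u in util)            # penalize hosts above 100%
    migr_cost = sum(p != c for p, c in zip(placement, CURRENT)) / len(placement)
    energy = sum(0.1 + 0.9 * u for u in util if u > 0) / HOSTS  # idle + load-proportional power
    balance = max(util) - min(util)                             # lower spread = better resource use
    return w_util * balance + w_migr * migr_cost + w_energy * energy + overload

def random_search(iterations=2000, seed=1):
    rng = random.Random(seed)
    best, best_fit = list(CURRENT), fitness(CURRENT)
    for _ in range(iterations):
        cand = [rng.randrange(HOSTS) for _ in VM_LOAD]
        f = fitness(cand)
        if f < best_fit:
            best, best_fit = cand, f
    return best, best_fit

placement, score = random_search()
print("best placement:", placement, "fitness:", round(score, 4))
```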
Low-cost supply chain management and value chain management with real-time advance inexpensive network computing
Article, Advances in Intelligent Systems and Computing, 2016, DOI Link
For increased profitability and sustainability, e-commerce systems should be backed by efficient and effective Supply Chain Management and Value Chain Management systems. However, it is very difficult to build such systems, as their functions are complex and they are CAPEX- and OPEX-intensive. To overcome these problems, an inexpensive supply chain management system was designed and developed successfully with Real-time Advance Inexpensive Network Computing. The results of this research work are presented in this article.
Communication security in real-time advanced inexpensive networks
Conference paper, 2014 International Conference on Devices, Circuits and Communications, ICDCCom 2014 - Proceedings, 2014, DOI Link
Mobile devices are being used increasingly all over the world, and as a result a great deal of important and sensitive content is now communicated over mobile networks. We earlier devised a low-cost, innovative Real-time Advanced Inexpensive Network computing approach to accomplish several computing tasks with ordinary mobile devices, so mobiles can now be used not only for communication but also for serious computing. Consequently, the need to secure the content communicated among mobile devices grows day by day. AES is the most popular and widely used cryptography-based security system in the world and can even be embedded into mobile devices. To ensure end-to-end security of the content and the authenticity of the recipient, we propose a modified AES algorithm in this article. The modified algorithm was implemented and produced the desired results. Existing AES-based encryption hardware can still be used, with minor software changes, to implement the modified algorithm.
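The abstract does not specify how AES was modified, so the snippet below shows only the standard baseline it builds on: authenticated AES (AES-GCM) protecting a message end to end between mobile endpoints, using the `cryptography` package. Key distribution, the recipient-authenticity mechanism, and the paper's modification are outside this sketch.

```python
# Baseline sketch: standard authenticated AES (AES-GCM) protecting a message
# end to end between mobile endpoints. The paper's modification to AES is not
# described in the abstract and is not reproduced here.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # shared key, distributed out of band (assumption)
nonce = os.urandom(12)                      # must be unique per message under a given key
sender_id = b"device-A"                     # associated data bound to the ciphertext

ciphertext = AESGCM(key).encrypt(nonce, b"sensitive mobile payload", sender_id)

# Receiver side: decryption raises InvalidTag if the content or sender ID was tampered with.
plaintext = AESGCM(key).decrypt(nonce, ciphertext, sender_id)
assert plaintext == b"sensitive mobile payload"
```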
Modified Real-Time Advanced Inexpensive Networks for Critical Infrastructure Security and Resilience
Conference paper, Advances in Intelligent Systems and Computing, 2014, DOI Link
The critical communication and information technology infrastructure required for delivering the Government's financial-inclusion and Government-to-Citizen services is taken as the Critical Infrastructure studied by the authors. On the basis of practical research work, a novel, efficient, effective, and sustainable system for ensuring the safety, security, and resilience of critical IT infrastructure is presented in this article. The various types of attacks and threats reported in the literature were studied, and the main reasons for security threats and attacks were analysed. A solution hypothesis is proposed: a new mode of computation, Real-time Advanced Inexpensive Network (RAIN) Computing, together with a multi-level, scalable, low-cost security solution built on RAIN computing to safeguard and ensure the resilience of critical IT infrastructure for e-Governance projects. Though the focus of experimentation is critical IT and communication infrastructure, the security methodology presented in the paper can be extended to safeguard other types of critical infrastructure as well. © Springer International Publishing Switzerland 2014.
Data driven mobile commerce intelligence: With real-time advanced inexpensive network computing
Conference paper, Proceedings - 2014 International Conference on Data Science and Engineering, ICDSE 2014, 2014, DOI Link
Present data-driven business intelligence systems and the associated technologies and protocols are not suitable for agri-business. Distributed data, poor infrastructure, low e-readiness, and the complexity of agri-business problems pose special challenges for creating and managing data-driven agri-business intelligence. A novel RAIN-computing-based framework that provides data-driven mobile commerce intelligence for agri-businessmen is presented in this article, along with a case study.
An enhanced virtual carrier mechanism for MANET with multi rate traffic subdomains
Conference paper, 2013 4th International Conference on Computing, Communications and Networking Technologies, ICCCNT 2013, 2013, DOI Link
Tremendous research effort has been made to reduce idle time on the wireless medium of mobile ad hoc networks. The trade-off among idle channel, collisions, and throughput has always been a challenge: the hidden terminal problem creates collisions, while the exposed terminal problem leaves the channel unnecessarily idle. The standard virtual carrier mechanism and many alternative proposals for solving these problems impose unnecessary overhead on the nodes and delay in the network. Many of these schemes depend on the packet size to decide on a prior RTS/CTS exchange before sending data packets. In this paper we propose an enhanced virtual carrier scheme that takes into account the traffic around the nodes instead of the packet size. The modified scheme virtually splits the network into different traffic subdomains. Direct data packet transmission is allowed in low-traffic scenarios, while a prior RTS/CTS exchange is performed only in high-traffic subdomains, irrespective of the data packet size. In traffic subdomains between low and high, either direct transmission or a prior RTS/CTS exchange is used, depending on the node's previous subdomain. We evaluated the standard RTS/CTS scheme and other recent proposals, compared them with our proposal, and measured performance against different metrics. Simulation results show that our scheme outperforms the standard one in various scenarios. This improvement stems from the realistic assumption that hidden terminals are unlikely in low-traffic subdomains compared with high-traffic subdomains, instead of the unrealistic and pessimistic assumption of ubiquitous hidden terminals throughout the network. © 2013 IEEE.
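The decision rule in the abstract is concrete enough to state in code: direct transmission in low-traffic subdomains, a mandatory RTS/CTS exchange in high-traffic subdomains regardless of packet size, and history-dependent behaviour in between. The sketch below encodes that rule; the traffic thresholds and the flow-count traffic metric are assumptions, not values from the paper.

```python
# Illustrative encoding of the access decision described in the abstract:
# low-traffic subdomain  -> send the data packet directly,
# high-traffic subdomain -> always exchange RTS/CTS first (regardless of packet size),
# in-between             -> follow the behaviour of the node's previous subdomain.
# The thresholds below are assumptions, not values from the paper.

LOW_THRESHOLD = 3      # neighbouring active flows; assumed
HIGH_THRESHOLD = 8     # assumed

def classify_subdomain(active_neighbour_flows: int) -> str:
    if active_neighbour_flows <= LOW_THRESHOLD:
        return "low"
    if active_neighbour_flows >= HIGH_THRESHOLD:
        return "high"
    return "medium"

def access_mode(active_neighbour_flows: int, previous_subdomain: str) -> str:
    sub = classify_subdomain(active_neighbour_flows)
    if sub == "low":
        return "direct"            # skip the virtual carrier handshake entirely
    if sub == "high":
        return "rts_cts"           # RTS/CTS before every data packet
    # medium traffic: fall back to the previous subdomain's behaviour
    return "direct" if previous_subdomain == "low" else "rts_cts"

# Example: a node that just left a low-traffic region and now sees 5 active flows
print(access_mode(5, previous_subdomain="low"))    # -> direct
print(access_mode(9, previous_subdomain="low"))    # -> rts_cts
```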
Fault tolerance – Case study
Conference paper, ACM International Conference Proceeding Series, 2012, DOI Link
In a clustered environment we have a large number of independent components cooperating or collaborating on a computation. Any of these components can fail at any time, resulting in erroneous output, and many techniques have been developed to provide resilience against such faults. This paper discusses various fault-tolerance techniques. The success of cluster computing is based on low-cost, widely used commercial off-the-shelf (COTS) hardware and software. Copyright © 2012 ACM.