Interoperability Between OPC-UA and MQTT in Hybrid Industrial IoT (IIoT) Environments
Dr Satish Anamalamudi, Ms Madupuri Reddy Priya, Ms Sruthi S, V D Panduranga Sai Guptha
Source Title: 2025 17th International Conference on COMmunication Systems and NETworks (COMSNETS), DOI Link
The Industrial Internet of Things (IIoT) is reshaping how industries operate by making it easier to gather, analyze, and use data in real time, which leads to smarter decisions and greater efficiency. However, for many industries still reliant on older systems, a significant challenge arises: getting traditional protocols like OPC-UA (Open Platform Communications Unified Architecture) to work seamlessly with modern IIoT protocols like MQTT (Message Queuing Telemetry Transport). OPC-UA is a reliable and secure protocol widely used in legacy automation systems, while MQTT, known for its lightweight, efficient messaging, is well-suited to the needs of IIoT. This paper explores practical ways to bridge these two protocols, enabling smooth data flow between legacy industrial systems and newer, cloud-based IIoT platforms. We examine specific approaches to this interoperability challenge, including using middleware gateways, deploying edge devices that support both protocols, and creating broker-based solutions that allow seamless data exchange. Each of these methods has its own benefits and challenges, and our analysis looks closely at issues such as differences in data formats, the risk of added latency, and security concerns that arise when connecting these distinct systems. Through real-world examples, we demonstrate how industries are using these techniques to connect OPC-UA-enabled devices to cloud analytics platforms, enable remote monitoring for predictive maintenance, and integrate systems from multiple vendors. We then evaluate the performance of these hybrid setups, focusing on latency, reliability, and security. The insights from this paper aim to guide industry professionals on how to best combine OPC-UA and MQTT, ensuring reliable, efficient, and secure data integration as they transition toward a connected IIoT ecosystem.
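As a hedged illustration of the middleware-gateway approach described in this abstract, the sketch below polls a single OPC-UA variable and republishes it over MQTT. It assumes the Python `asyncua` and `paho-mqtt` (1.x client API) packages; the endpoint URL, node id, broker address, and topic are placeholders rather than values taken from the paper.

```python
import asyncio
import json

import paho.mqtt.client as mqtt        # assumes the paho-mqtt 1.x constructor style
from asyncua import Client              # assumes the asyncua package

OPCUA_ENDPOINT = "opc.tcp://localhost:4840/freeopcua/server/"   # placeholder endpoint
NODE_ID = "ns=2;i=2"                                            # placeholder node id
MQTT_BROKER, MQTT_TOPIC = "localhost", "plant/line1/temperature"

async def bridge():
    mqtt_client = mqtt.Client()
    mqtt_client.connect(MQTT_BROKER, 1883)
    mqtt_client.loop_start()
    async with Client(url=OPCUA_ENDPOINT) as ua_client:
        node = ua_client.get_node(NODE_ID)
        while True:
            value = await node.read_value()                     # read the OPC-UA variable
            payload = json.dumps({"node": NODE_ID, "value": value})
            mqtt_client.publish(MQTT_TOPIC, payload, qos=1)     # republish to the broker
            await asyncio.sleep(1.0)                            # naive 1 Hz polling loop

if __name__ == "__main__":
    asyncio.run(bridge())
```

A production gateway would typically subscribe to OPC-UA data-change notifications instead of polling and map the information model onto a structured topic hierarchy, but the core translation step is the same.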
Finding Influential Nodes using Mixed Centralities in Complex Networks
Source Title: 2025 17th International Conference on COMmunication Systems and NETworks (COMSNETS), DOI Link
In an era of rapidly advancing technology, social media plays a crucial role, allowing people to interact with each other, share knowledge, and shape influence. Identifying the most important influencers within complex networks is essential for effectively spreading information globally. These influencers are pivotal in various domains, such as marketing, information dissemination, and opinion formation. Various centrality measures, such as isolating centrality, local-global centrality, closeness, betweenness, and degree centrality, have been developed to identify influential nodes. These measures are categorized into two types: local and global measures. Local measures rely solely on local information, resulting in lower accuracy, whereas global metrics use global information, which increases computational complexity. To tackle these issues, we propose a novel Mixed Centrality (MC) metric based on the local average shortest path along with semi-local and isolating centrality. To evaluate the efficiency of MC, we use the SIR model to measure the effect of data dissemination. Subsequently, we use Kendall's tau coefficient to calculate the similarity between our method and existing centrality measures on real-world datasets, such as bio-celegans and fb-pages-politicians.
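The exact Mixed Centrality formula is defined in the paper; the sketch below is only an illustrative combination of a semi-local term (two-hop neighborhood size) with a path-based term (closeness), using `networkx`, to show how such a mixed score can be computed and ranked. The weighting `alpha` and the example graph are assumptions.

```python
import networkx as nx

def mixed_score(G, alpha=0.5):
    """Illustrative mix of semi-local and path-based information (not the paper's MC formula)."""
    n = G.number_of_nodes()
    degree = dict(G.degree())
    # semi-local term: number of distinct nodes reachable within two hops
    two_hop = {v: len(nx.single_source_shortest_path_length(G, v, cutoff=2)) - 1 for v in G}
    # path-based term: closeness, i.e. inverse average shortest-path length
    closeness = nx.closeness_centrality(G)
    return {v: alpha * (degree[v] + two_hop[v]) / (2 * (n - 1)) + (1 - alpha) * closeness[v]
            for v in G}

if __name__ == "__main__":
    G = nx.karate_club_graph()                      # small stand-in for bio-celegans etc.
    ranking = sorted(mixed_score(G).items(), key=lambda kv: kv[1], reverse=True)
    print("top candidate influencers:", ranking[:5])
```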
Identifying influential nodes using semi local isolating centrality based on average shortest path
Source Title: Journal of Intelligent Information Systems, Quartile: Q1, DOI Link
In complex networks, identifying influential nodes becomes critical as these networks emerge rapidly. Extensive studies have been carried out on intricate networks to comprehend diverse real-world networks, including transportation networks, Facebook networks, animal social networks, etc. Centrality measures like degree, betweenness, closeness, and clustering centralities are used to find influential nodes, but these measures have limitations when applied to large-scale networks. These centrality measures are classified into global and local centralities. Semi-local structures perform well compared to local and global centralities, but an efficient centrality for finding influential nodes remains a challenging issue in large-scale networks. To address this challenge, a Semi-Local Average Isolating Centrality (SAIC) metric is proposed that integrates semi-local and local information, along with the relative change in average shortest path, to identify important nodes in large networks. We consider an extended neighborhood concept for selecting a node's nearest neighbors, along with a weighted edge policy, to find the best influential nodes using SAIC. In addition, SAIC also considers isolating nodes, which significantly impact network connectedness by maximizing the number of connected components upon removal. As a result, SAIC differentiates itself from other centrality metrics by employing a distributed approach to define the semi-local structure and utilizing an efficient edge weighting policy. The analysis of SAIC has been performed on multiple real-world datasets using Kendall's tau coefficient. Using the Susceptible-Infected-Recovered (SIR) and Independent Cascade (IC) models, the performance of SAIC has been examined to determine maximum information spread in comparison with the most recent metrics on several real-world datasets. Our proposed method SAIC performs better in terms of information spreading when compared with other existing methods, with an improvement ranging from 4.11% to 17.9%.
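The evaluation pipeline described in this abstract (rank nodes with a centrality, estimate each node's true spreading power with SIR simulations, then compare the rankings with Kendall's tau) can be sketched as follows. Degree centrality stands in for SAIC, whose formula is given in the paper, and the infection parameters are illustrative.

```python
import random

import networkx as nx
from scipy.stats import kendalltau

def sir_spread(G, seed, beta=0.1, runs=50):
    """Average number of ever-infected nodes when the epidemic starts at `seed`."""
    total = 0
    for _ in range(runs):
        infected, recovered = {seed}, set()
        while infected:
            new_infected = set()
            for u in infected:
                for v in G.neighbors(u):
                    if v not in infected and v not in recovered and random.random() < beta:
                        new_infected.add(v)
            recovered |= infected          # recovery rate 1: infected nodes recover next step
            infected = new_infected - recovered
        total += len(recovered)
    return total / runs

if __name__ == "__main__":
    G = nx.karate_club_graph()                           # stand-in for a real-world dataset
    centrality = nx.degree_centrality(G)                 # stand-in for the SAIC ranking
    spread = {v: sir_spread(G, v) for v in G}
    nodes = list(G)
    tau, _ = kendalltau([centrality[v] for v in nodes], [spread[v] for v in nodes])
    print(f"Kendall tau between centrality ranking and SIR spread: {tau:.3f}")
```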
Blockchain and AI for Educational Data Analytics in the Modern Education System
Source Title: Blockchain and AI in Shaping the Modern Education System, DOI Link
This chapter explores the transformative potential of integrating blockchain and artificial intelligence (AI) technologies within educational data analytics. It begins by examining blockchain's capacity to enhance data security, streamline record-keeping, and ensure transparent credential verification. Concurrently, it analyzes AI's role in enabling adaptive learning, predictive modeling, and insightful data analysis to improve student outcomes and optimize educational strategies. The chapter further evaluates the synergistic benefits of combining blockchain and AI, proposing a robust framework to address prevalent challenges in the education sector, including data privacy, security, and personalized learning. By securing student records through blockchain's immutability and enhancing personalized learning experiences via AI-driven analytics, the chapter presents a comprehensive approach to modernizing educational systems. Additionally, it addresses technical challenges such as scalability and interoperability, alongside ethical considerations like data privacy, consent, and algorithmic bias. The chapter concludes with a call for collaborative efforts among educators, technologists, and policymakers to leverage these technologies, navigate their challenges, and fully realize their potential in revolutionizing education
Deep Learning Approaches for Intelligent Cyber Threat Detection in Modern Education Systems
Source Title: Blockchain and AI in Shaping the Modern Education System, DOI Link
In the ever-evolving landscape of modern education systems, the integration of technology has become ubiquitous, opening new avenues for teaching and learning. However, this increased reliance on digital platforms has also given rise to unprecedented cybersecurity challenges, necessitating advanced detection mechanisms to safeguard sensitive educational data. This book chapter explores the application of deep learning approaches for intelligent cyber threat detection in the context of the modern education system. The chapter begins by providing a comprehensive overview of the evolving cyber threat landscape within educational institutions, highlighting the diverse range of attacks targeting student records, intellectual property, and critical infrastructure. It emphasizes the need for proactive and adaptive cybersecurity measures to counteract these threats effectively. Subsequently, the chapter delves into the foundational principles of deep learning, elucidating its capacity to autonomously learn intricate patterns and anomalies from vast datasets. Various deep learning architectures, such as convolutional neural networks and recurrent neural networks, are discussed in the context of their applicability to cybersecurity in education. The practical implementation of deep learning models for cyber threat detection is then explored, with a case study that illustrates how these models can detect malware and identify suspicious activities, thereby fortifying the resilience of educational systems against cyber threats. In conclusion, this book chapter provides a comprehensive exploration of deep learning approaches as a potent tool for intelligent cyber threat detection in modern education systems.
Leveraging Seasonal Trends and Symptomatic Data for Precise Disease Prediction Using Machine Learning
Dr Murali Krishna Enduri, Dr Satish Anamalamudi, Anjana Harshitha Somayajula., Aetesh Chagantipati., Amish Gollapudi., Maganti Suryanarayana Dattatreya
Source Title: 2025 IEEE 14th International Conference on Communication Systems and Network Technologies (CSNT), DOI Link
Our analysis examines the amalgamation of seasonal and symptomatic data to boost the forecasting of diseases, emphasizing the need for proactive healthcare measures. The rising need for accurate, data-driven preventive healthcare solutions, especially in regions where seasonal variations significantly impact disease patterns, serves as the foundation for this study. We developed a comprehensive prediction framework using a range of machine learning models, including logistic regression, decision trees, Naive Bayes, K-nearest neighbors, support vector machines (SVM), and random forests. Our results show that logistic regression and SVM achieved high accuracy both with and without the use of SMOTE, demonstrating their effectiveness in handling imbalanced datasets. This technique links symptoms, seasonal variations, and disease patterns for precise categorization and actionable insights, enabling effective illness detection and preventive coordination. The findings of this study help foster the use of machine learning methods to improve preventive healthcare and illustrate the need to include contextual seasonal data in healthcare forecasts. Additionally, the research highlights how symptoms and seasonal variables work together dynamically, demonstrating the potential of adaptive models to assist with healthcare decision-making in real-time
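A minimal sketch of the modeling setup described above: SMOTE oversampling inside a cross-validated pipeline, evaluated for logistic regression and SVM. Synthetic data stands in for the paper's symptom-and-season dataset, and all parameter values are assumptions.

```python
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# synthetic, imbalanced stand-in for symptom + seasonal features
X, y = make_classification(n_samples=1000, n_features=12, weights=[0.9, 0.1], random_state=0)

for name, clf in [("logistic_regression", LogisticRegression(max_iter=1000)),
                  ("svm", SVC(kernel="rbf"))]:
    # SMOTE lives inside the pipeline, so it is only applied to training folds
    pipe = Pipeline([("scale", StandardScaler()),
                     ("smote", SMOTE(random_state=0)),
                     ("model", clf)])
    scores = cross_val_score(pipe, X, y, cv=5, scoring="f1")
    print(f"{name}: mean F1 = {scores.mean():.3f}")
```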
An Explainable AI-Driven Hybrid Model for Enhanced Intrusion Detection in Network Security
Dr Satish Anamalamudi, Harsha Garikapati., Kausik Challapalli., Sai Venkat Ramineni., Veera Manikanta Sai Adusumilli., Roop Sai Charan Kothamasu
Source Title: 2025 Fifth International Conference on Advances in Electrical, Computing, Communication and Sustainable Technologies (ICAECT), DOI Link
Due to ever-evolving cyber threats, there's a need for a highly robust and interpretable intrusion detection system to strengthen network security. This research proposes an Explainable Artificial Intelligence (AI)-based Hybrid Model for network intrusion detection incorporating the use of an ensemble of classifiers trained on the CICIDS2017 dataset. The model consists of preprocessing, PCA for dimensionality reduction, and feature selection to detect key attributes. A Voting Classifier combines the strengths of individual classifiers so that it performs equally well on precision, recall and F1 score. We also tune its parameters using RandomizedSearchCV to optimise the classifier. The hybrid ensemble method captures network intrusion by merging the strengths of all classifiers. To make the system very transparent, we have used SHAP analysis to explain the important features and their interpretations. This allows network security executives to find out why a decision was made. SMOTE (Synthetic Minority Over-sampling Technique) is also applied to handle class imbalance and further enhance the model robustness. The model achieves high performance with 98.58% accuracy, 98.59% precision, 98.58% recall, F1 score of 98.56%, and ROC AUC score of 99.35%, successfully distinguishing between benign and malicious traffic. This study will offer security experts a real-time security solution that is accurate, transparent and explainable. It will advance explainable AI-based IDS
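A hedged sketch of the hybrid ensemble described above: scaling, PCA, and a soft voting classifier tuned with RandomizedSearchCV. Synthetic data replaces CICIDS2017, the SMOTE and SHAP steps from the paper are omitted to keep the example short, and the parameter grid is illustrative.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RandomizedSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

# synthetic stand-in for preprocessed CICIDS2017 flow features
X, y = make_classification(n_samples=2000, n_features=30, random_state=0)

voter = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("rf", RandomForestClassifier()),
                ("dt", DecisionTreeClassifier())],
    voting="soft")                                   # combine class probabilities

pipe = Pipeline([("scale", StandardScaler()), ("pca", PCA()), ("vote", voter)])

param_dist = {
    "pca__n_components": [10, 15, 20],
    "vote__rf__n_estimators": [100, 200],
    "vote__dt__max_depth": [5, 10, None],
}
search = RandomizedSearchCV(pipe, param_dist, n_iter=5, cv=3, scoring="f1", random_state=0)
search.fit(X, y)
print("best params:", search.best_params_, "best F1:", round(search.best_score_, 3))
```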
Swarm Intelligence in IoT and Edge Computing
Source Title: Swarm Intelligence, Quartile: Q2, DOI Link
Swarm intelligence plays a crucial role in enhancing the performance of IoT and edge computing. Swarm intelligence, a natural systems-inspired collective decision-making paradigm, has helped to solve most of the existing issues like channel selection, routing table optimization, and scheduling operations in IoT networks. This study discusses how swarm intelligence might improve anomaly detection, energy-efficient routing, and scalable, decentralized algorithms. IoT, edge computing, and swarm intelligence enable efficient data processing, network performance, and novel solutions to complicated issues. Swarm intelligence enhances IoT and edge computing systems, bringing new ideas and solutions for the growing environment of interconnected devices
Machine Learning and Deep Learning with Swarm Algorithms
Source Title: Swarm Intelligence, Quartile: Q2, DOI Link
The emerging disciplines of machine learning and deep learning have brought about an innovative period, restructuring domains such as image identification, natural language comprehension, and autonomous systems. However, there are still complex optimisation and decision-making difficulties that need to be addressed. Swarm algorithms, inspired by the coordinated movement of natural swarms, are emerging as innovative solutions in this field. This study investigates the productive convergence of machine learning, deep learning, and swarm algorithms, with the goal of unleashing their combined potential. We explore the core principles and mechanisms of swarm algorithms such as particle swarm optimisation, ant colony optimisation, bee-inspired algorithms, firefly algorithms, and bat algorithms. These decentralised algorithms, inspired by nature, are highly effective in addressing various optimisation goals by imitating the self-organizing behaviours observed in natural systems. Additionally, we explore the complex incorporation of swarm algorithms into the operations of machine learning. We aim to utilise swarm-based techniques to enhance the performance and generalizability of machine learning models by optimising hyperparameter tweaking, model selection, and feature engineering. This chapter examines the use of swarm intelligence in the emerging subject of neural architecture search, which is crucial for automating the complex process of designing deep neural networks. In order to strengthen our comprehension, we undertake a journey through captivating instances and tangible implementations in fields like robotics and healthcare. This emerging discipline combines the collective intelligence of groups of organisms with the data-driven capabilities of machine learning to create innovative solutions that can improve decision-making, optimise intricate systems, and drive progress in the field of artificial intelligence
Machine Learning Approaches to Identify AI-Generated Text: a Comparative Analysis
Dr Satish Anamalamudi, Sruthi Sreekantan, T Krishna Mihir., K Venkata Sai Harsha., S Yuva Nitya., G Bala Krishna
Source Title: 2024 International Conference on Intelligent Computing and Emerging Communication Technologies (ICEC), DOI Link
Large-scale language models (LLMs), such as OpenAI's GPT-4 and Google's Pathways Language Model, have become indispensable to our daily lives and work, frequently being used without our conscious knowledge. Studies indicate that even crowdsourcing professionals have trouble distinguishing human-generated content from AI-generated content, and only subtle flaws in AI writing make this task possible at all. Even though these models have many benefits and the potential to totally change the way people work and learn, they have also drawn a lot of attention due to their potential drawbacks. One notable use of LLMs that demonstrates their potential for task automation is the generation of academic reports or articles with little to no human input. As a result, scientists have concentrated on creating detectors to deal with possible wrongdoing related to information generated by LLMs. However, current methods frequently ignore the vital component of generalizability in favor of accuracy on small datasets. Our study offers a thorough examination of machine learning (ML) techniques intended to differentiate text produced by artificial intelligence (AI) from language created by humans. We use a Kaggle dataset in order to gain greater insight into the data collection procedure. To address the challenge of accurately and reliably detecting text produced by AI, we explore various machine learning approaches. We examine a broad spectrum of algorithms, taking into account performance measures and their generalizability to different datasets and scenarios. By providing a more thorough explanation of the significance of LLMs, the challenges they pose, and the specific focus of our study, we broaden the abstract and lay the foundation for a thorough analysis of ML algorithms for text identification.
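One hedged way to set up the comparison the abstract outlines is a TF-IDF representation feeding several classical classifiers, scored with cross-validation. The file `essays.csv` and its `text`/`generated` columns are placeholders for the Kaggle dataset mentioned above, and the algorithm choices are illustrative rather than the paper's exact set.

```python
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# placeholder dataset: one row per document, label 1 = AI-generated, 0 = human-written
df = pd.read_csv("essays.csv")
X, y = df["text"], df["generated"]

for name, clf in [("naive_bayes", MultinomialNB()),
                  ("logistic_regression", LogisticRegression(max_iter=1000)),
                  ("linear_svm", LinearSVC())]:
    pipe = make_pipeline(TfidfVectorizer(stop_words="english", max_features=50000), clf)
    scores = cross_val_score(pipe, X, y, cv=5, scoring="accuracy")
    print(f"{name}: accuracy = {scores.mean():.3f} ± {scores.std():.3f}")
```

Cross-dataset generalizability, which the abstract emphasizes, would be checked by training the same pipeline on one corpus and scoring it on a different one.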
Real Estate Price Prediction: Optimized Ridge and Lasso Regression Analysis
Dr Satish Anamalamudi, Ms Madupuri Reddy Priya, Kausik Challapalli., Prashanthi Thota., Harsha Garikapati., Veera Manikanta Sai Adusumilli
Source Title: 2024 OITS International Conference on Information Technology (OCIT), DOI Link
This work develops a reliable house price prediction model, keeping housing data at the core of the analysis. Data munging, including handling of missing values, outliers, and categorical variables, is the first step of this process. The data is then explored intuitively, new features like price per square meter are created, and further visualization and statistical analysis are conducted. For model selection, we use Ridge and Lasso regression, with GridSearchCV run to tune and optimize parameters. We assess model performance and model assumptions by examining residuals using Mean Absolute Error (MAE), Root Mean Squared Error (RMSE), and R-squared metrics. Once built, the model can be used by real estate professionals, investors, and policymakers to accurately predict housing prices based on factors such as location, area type, square footage, and total number of buckets, and it offers a highly valuable statistical analysis of these values. Future work could include further expanding the model by adding new variables, using more sophisticated algorithms, or carrying out similar analyses in other regions of the developing world to achieve higher accuracy.
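A compact sketch of the tuning and evaluation loop described above: GridSearchCV over the regularization strength of Ridge and Lasso, reporting MAE, RMSE, and R². The CSV name, target column, and alpha grid are placeholders, and encoding of categorical features is omitted for brevity.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import Lasso, Ridge
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("housing.csv")                                  # placeholder dataset (numeric features)
X, y = df.drop(columns=["price"]), df["price"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

for name, model in [("ridge", Ridge()), ("lasso", Lasso(max_iter=10000))]:
    pipe = make_pipeline(StandardScaler(), model)                # step is auto-named "ridge"/"lasso"
    grid = GridSearchCV(pipe, {f"{name}__alpha": [0.01, 0.1, 1.0, 10.0]}, cv=5)
    grid.fit(X_train, y_train)
    pred = grid.predict(X_test)
    rmse = np.sqrt(mean_squared_error(y_test, pred))
    print(f"{name}: MAE={mean_absolute_error(y_test, pred):.2f} "
          f"RMSE={rmse:.2f} R2={r2_score(y_test, pred):.3f} best={grid.best_params_}")
```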
Energy-Efficient UAV Path Planning Using PSO-ABC Algorithm in Obstacle-Rich Environments
Dr Satish Anamalamudi, Bodhisattwa Baidya., Atanu Mondal., Sankha Mallick
Source Title: 2024 OITS International Conference on Information Technology (OCIT), DOI Link
A unique hybrid approach for energy-efficient path planning in two-dimensional landscapes with obstacles is presented in this article, combining Artificial Bee Colony (ABC) with Particle Swarm Optimisation (PSO). Path planning is a critical challenge in robotics, autonomous vehicles, and various other fields. While numerous algorithms exist, many struggle to balance computational efficiency, solution quality, and energy considerations. In order to overcome these obstacles, we constructed a hybrid PSO-ABC algorithm that combines the benefits of PSO's global search capabilities with ABC's local search strengths. The algorithm optimizes paths considering multiple objectives: path length, obstacle avoidance, and energy consumption. We represent paths using a series of handle points, employing spline interpolation for smooth trajectories. The algorithm initializes solutions using both random and straight-line approaches, then iteratively improves them through particle updates and bee phases. We introduce an adaptive inertia weight for PSO and incorporate energy consumption into the cost function. Experimental results demonstrate the algorithm's effectiveness in finding feasible, energy-efficient paths in environments with randomly placed circular obstacles. The hybrid approach shows improved convergence compared to standalone PSO or ABC algorithms. Furthermore, the inclusion of energy considerations in path optimization presents a more realistic model for real-world applications. This research enriches the field of intelligent path planning by providing a stable, energy-conscious algorithm that works well in challenging settings.
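The full PSO-ABC update loop is beyond a short example, but the cost function the abstract describes (spline-smoothed handle points scored on length, obstacle clearance, and an energy term) can be sketched as below. Obstacle positions, weights, and the turning-based energy proxy are assumptions, not values from the paper.

```python
import numpy as np
from scipy.interpolate import CubicSpline

OBSTACLES = [((4.0, 4.0), 1.0), ((7.0, 2.0), 1.5)]        # (center, radius), illustrative

def path_cost(handles, start=(0, 0), goal=(10, 5), n_samples=200,
              w_len=1.0, w_obs=50.0, w_energy=0.1):
    """Score a candidate path given by flattened (x, y) handle points."""
    pts = np.vstack([start, handles.reshape(-1, 2), goal])
    spline = CubicSpline(np.linspace(0, 1, len(pts)), pts, axis=0)   # smooth trajectory
    xy = spline(np.linspace(0, 1, n_samples))
    seg = np.diff(xy, axis=0)
    length = np.sum(np.linalg.norm(seg, axis=1))
    # penalty for sampled points that fall inside any circular obstacle
    penalty = 0.0
    for (cx, cy), r in OBSTACLES:
        dist = np.linalg.norm(xy - np.array([cx, cy]), axis=1)
        penalty += np.sum(np.maximum(0.0, r - dist))
    # crude energy proxy: total turning along the path (sharper turns cost more)
    heading = np.arctan2(seg[:, 1], seg[:, 0])
    energy = np.sum(np.abs(np.diff(heading)))
    return w_len * length + w_obs * penalty + w_energy * energy

if __name__ == "__main__":
    straight = np.linspace([0, 0], [10, 5], 5)[1:-1].ravel()   # straight-line initialization
    print("straight-line cost:", round(path_cost(straight), 3))
```

In the hybrid optimizer, PSO particle updates and ABC employed/onlooker/scout phases would each propose new handle-point vectors and keep those with lower `path_cost`.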
Microplastic Detection in Drinking Water: A Comparative Analysis of CNN-SVM and CNN-RF Hybrid Models
Dr Satish Anamalamudi, Dr Murali Krishna Enduri, Prashanthi Thota., Kausik Challapalli., Harsha Garikapati., Veera Manikanta Sai Adusumilli
Source Title: 2024 OITS International Conference on Information Technology (OCIT), DOI Link
The growing presence of microplastics in drinking water poses severe dangers to health and the environment, requiring enhanced detection methods. This work addresses the constraints of conventional detection methods, such as visual inspection and Raman spectroscopy, which are labour-intensive and difficult to scale. By contrasting CNN-SVM and CNN-RF, two highly efficient hybrid machine learning models, the essential purpose is to improve the accuracy of microplastic detection. Convolutional neural networks (CNNs) extract features from images of water samples, and the method classifies them using Support Vector Machine (SVM) and Random Forest (RF) algorithms. The study assesses the models' precision, recall, F1-score, and overall accuracy. The findings show that these hybrid models greatly enhance detection abilities, resulting in a more effective and flexible solution. Practical uses involve real-time monitoring in water treatment plants, ecological evaluations of water bodies, as well as household water filtering systems, which provide vital information for compliance with regulations and public health safety. This study adds to our knowledge of the problem of microplastic pollution and indicates possible future uses in environmental monitoring and policy-making, which will help attempts to reduce the harmful effects of microplastics in drinking water.
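A hedged sketch of the CNN-SVM and CNN-RF hybrids: a pretrained CNN (MobileNetV2 here, as a stand-in for the paper's feature extractor) produces image embeddings, which are then classified with SVM and Random Forest. The directory layout, image size, and classifier settings are placeholders.

```python
import numpy as np
import tensorflow as tf
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

IMG_SIZE = (224, 224)
backbone = tf.keras.applications.MobileNetV2(include_top=False, pooling="avg",
                                             input_shape=IMG_SIZE + (3,))

def embed(dataset):
    """Run every batch through the frozen CNN and collect feature vectors + labels."""
    feats, labels = [], []
    for images, ys in dataset:
        x = tf.keras.applications.mobilenet_v2.preprocess_input(images)
        feats.append(backbone(x, training=False).numpy())
        labels.append(ys.numpy())
    return np.concatenate(feats), np.concatenate(labels)

# "water_samples/" with one sub-folder per class is a placeholder layout
ds = tf.keras.utils.image_dataset_from_directory("water_samples/",
                                                 image_size=IMG_SIZE, batch_size=32)
X, y = embed(ds)

for name, clf in [("CNN-SVM", SVC(kernel="rbf")),
                  ("CNN-RF", RandomForestClassifier(n_estimators=200))]:
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: cross-validated accuracy = {acc:.3f}")
```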
Agamographs Using Rubik’s Cubes: Morphing Images Through Strategic Mosaic Arrangements
Dr Satish Anamalamudi, Arkopal Das., Sarbajit Manna., Bodhisattwa Baidya
Source Title: 2024 OITS International Conference on Information Technology (OCIT), DOI Link
This paper introduces a mathematical technique for creating dynamic artworks using Rubik's cubes. By strategically arranging two images on Rubik's cubes in segmented patterns, the resulting artwork presents a captivating illusion of morphing between two images when observed from different viewpoints. As viewers move around the artwork, the images seem to shift and transform, creating an engaging visual experience. The core of this approach lies in an algorithmic process that generates mosaics representing two images on Rubik's cubes, which are then seamlessly combined to form a unified artwork. The final result is presented in a simulated 3D environment, enhancing its immersive quality. The 3D GUI environment not only provides users with a set of controls to rotate and zoom the structure but also provides an FPS log to show how many frames are being generated per second and how much the program is lagging. The paper provides a comprehensive description of the strategy employed to achieve this artistic effect, detailing the step-by-step process from image segmentation to mosaic generation and merging, along with the constraints and required mathematical preliminaries. Additionally, the paper undertakes a time complexity analysis of the algorithm, which is linear, guaranteeing efficiency and modest computational requirements.
MindSight: Revolutionizing Brain Tumor Diagnosis with Deep Learning
Dr Murali Krishna Enduri, Dr Satish Anamalamudi, Mr Koduru Hajarathaiah, Rushita Gandham., Keerthi Reddy Manambakam., Navyasri Nannapaneni
Source Title: 2024 IEEE 13th International Conference on Communication Systems and Network Technologies (CSNT), DOI Link
Brain tumors, characterized by abnormal cell growth, pose a substantial health challenge with non-cancerous (benign) and cancerous (malignant) categories. India witnesses the diagnosis of approximately 40,000 fresh instances of brain tumors annually. The rarity and diversity of tumor types make predicting survival rates challenging. Efficient identification of cerebral abnormalities is essential for the timely and effective management of neurological conditions. Exploring the application of deep learning, this study investigates brain tumor detection using a curated dataset of Magnetic Resonance Images (MRI). Utilizing this dataset, brain tumor detection is advanced through the application of diverse models, including EfficientNetB3, ResNet50, MobileNetV3, and VGG16. The study prioritizes dataset preprocessing, emphasizing data augmentation. Diverse brain tumor images contribute to model training, incorporating transfer learning from pre-trained models on extensive datasets for discerning intricate patterns in medical images. Efficiency evaluation considers computational resources, training time, and complexity. Quantitative metrics F1 score, accuracy, recall, and precision are employed to gauge model performance in classifying the tumor and non-tumor regions. In the conducted study, VGG16 demonstrated the best performance compared to all other models
Evaluation of Asymmetric Link-Based AODV Routing Protocol for Low Power and Lossy Networks in COOJA Simulator
Source Title: 2024 International Conference on Computer, Electronics, Electrical Engineering & their Applications (IC2E3), DOI Link
Route discovery for Point-to-Point (P2P) traffic flows is critical in Low-power and Lossy Networks (LLNs), especially in home and building automation applications. Although the Routing Protocol for Low-Power and Lossy Networks (RPL), standardized by the Internet Engineering Task Force (IETF), is widely used, it struggles to efficiently handle P2P flows, leading to congestion at the Destination Oriented Directed Acyclic Graph (DODAG) root and increased packet delays. The P2P-RPL protocol aims to address this issue, but neither RPL nor P2P-RPL adequately account for asymmetric wireless links in their route computations. AODV-RPL, a protocol draft adopted by the IETF's ROLL working group, offers a reactive P2P route discovery mechanism based on Ad Hoc On-demand Distance Vector Routing (AODV), operating with RPL in its storing mode. This study assesses the performance of AODV-RPL using the Cooja simulator across various network topologies and node densities, considering both symmetric and asymmetric links. Through extensive simulations, it is demonstrated that AODV-RPL outperforms traditional RPL in Packet Delivery Ratio (PDR) and delay performance for numerous source and destination pairs. However, while AODV-RPL generally improves upon traditional RPL for P2P routes, some node pairs remain where the difference in relative hop distance is minimal
Enhancing Real Time Communication for Hearing Impaired: A CNN-Based Audio-to-captions system
Dr Satish Anamalamudi, Mr Mutupuri Shiva Shankar, Nikil Kumar Buddala., Krishna Kishore Buddi., Satyanarayana Chowdary Battula., Rohith Immadisetti
Source Title: 2024 International Conference on Intelligent Systems and Advanced Applications (ICISAA), DOI Link
For those with hearing disabilities, real-time speech-to-text conversion is essential for improving communication, particularly in dynamic settings like online conferences and classes. For inclusive and productive communication, this conversion must be accurate and fast. Our work improves the accuracy of speech recognition and builds a software solution that converts speech to text in real time. This research describes an architecture based on Variable Span Output Kernel (VOSK) models and Convolutional Neural Networks (CNNs) for a real-time audio-to-captions framework. The method converts raw audio into spectrograms so that features can be extracted. We implemented CNNs with multiple layers for efficient extraction and a VOSK model to adaptively handle variable speech patterns and complexities. To create accurate captions, a language model is combined with a streamlined pipeline that processes the information in real time. As a result, those who have hearing loss can take part fully in discussions without losing vital information. The CNN and VOSK models were compared to determine which better fulfills the requirements of computational efficiency and low latency: while the CNN model consistently achieved lower latency, making it more suitable for real-time applications where speed is critical, the VOSK model adapts well to a variety of speech patterns. This research is primarily focused on addressing the difficulties of minimizing latency and computational complexity while taking hardware restrictions into consideration, ensuring the system's effective deployment across a range of devices. By demonstrating the CNN model's superior latency in real-time audio-to-text conversion, this paper helps the hearing impaired gain better access to communication.
Enhancing Image Crispness: A Detailed System for Versatile Optimization
Dr Murali Krishna Enduri, Dr Satish Anamalamudi, Kalyan Kumar Doppalapudi., Geetha Siva Srinivas Gollapalli., Yaswanth Chowdary Thotakura., Shalom Raja Kasim
Source Title: 2024 International Conference on Intelligent Computing and Emerging Communication Technologies (ICEC), DOI Link
From space imagery to health checkups, improving photos is crucial, and improving the clarity of the image is central to this. Techniques such as adjusting contrast and equalizing the histogram in the spatial domain can improve clarity and sharpness by directly manipulating pixel data. Meanwhile, Fourier Transform-based enhancement analyzes and modifies frequency components to make precise modifications. These methods are crucial for tasks such as reducing noise in aerial photos and emphasizing detail in microscopic medical images, which illustrates how readily they can be employed to improve varied image quality. Choosing between spatial and frequency-based methods in practice depends on the picture's unique properties and the desired improvements. Using spatial-domain techniques on pixel values may improve the clarity of minor patterns in a picture, while frequency-based methods provide a different perspective based on the frequencies that make up the image. A solid balance between keeping critical sections clear and reducing unnecessary picture components is crucial. When improving photographs, this balance yields the greatest outcomes, demonstrating how these principles apply to real-life circumstances.
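The two families of techniques contrasted above can be sketched with OpenCV and NumPy: histogram equalization works directly on pixel values, while a simple Fourier-domain filter attenuates low frequencies to emphasize detail. The file name and mask size are illustrative choices, not values from the paper.

```python
import cv2
import numpy as np

img = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE)          # placeholder image

# 1) spatial domain: directly remap pixel intensities
equalized = cv2.equalizeHist(img)

# 2) frequency domain: attenuate low frequencies to emphasize edges and fine detail
f = np.fft.fftshift(np.fft.fft2(img.astype(np.float32)))
rows, cols = img.shape
crow, ccol = rows // 2, cols // 2
mask = np.ones((rows, cols), np.float32)
mask[crow - 10:crow + 10, ccol - 10:ccol + 10] = 0.2          # soft high-pass filter
sharpened = np.abs(np.fft.ifft2(np.fft.ifftshift(f * mask)))
sharpened = cv2.normalize(sharpened, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

cv2.imwrite("equalized.png", equalized)
cv2.imwrite("sharpened.png", sharpened)
```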
Privacy Mechanisms for Content-Centric Network Protecting Content Names and Content Data
Source Title: 2024 6th International Conference on Computational Intelligence and Networks (CINE), DOI Link
Information-Centric Networks (ICN) seek to transform the existing internet infrastructure by redirecting attention from hosts to data. Preserving individual privacy and safeguarding security is of utmost importance in the field of ICN. ICN caters to the increasing need for distributing content on the Internet, and many methods and platforms have been created to make privacy protection easier. These efforts prioritise both the protection of content and the privacy of users. A notable technological achievement in this field is PrivICN, which improves user security in the network by safeguarding the secrecy of content names and data. This study investigates the utilisation of homomorphic encryption to ensure the security of both content names and content data. Homomorphic encryption allows computations to be performed on encrypted data without the need for decryption, thereby maintaining data confidentiality and security. Encrypting sensitive data is an efficient method for maintaining privacy. Ensuring privacy in ICN is crucial, and this shift in approach tackles the growing demand for secure and efficient content distribution on the Internet. This paper examines different approaches and platforms that have been created to guarantee privacy in ICN. It specifically focuses on solutions for preserving privacy from both a content-centric and user-centric perspective.
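As a hedged illustration of computing on protected values, the sketch below uses Paillier encryption from the `phe` package to encrypt a content name and update an encrypted request counter without decrypting it. PrivICN and the scheme studied in the paper involve more elaborate key management; the content name and counter here are purely illustrative.

```python
from phe import paillier   # assumes the python-paillier (phe) package

public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# encode a content name as an integer and encrypt it before it enters the network
name_bytes = b"/videos/lecture01"                        # illustrative content name
enc_name = public_key.encrypt(int.from_bytes(name_bytes, "big"))

# a forwarding node can update an encrypted popularity counter without seeing its value
enc_hits = public_key.encrypt(0)
for _ in range(3):                                       # three requests for this content
    enc_hits = enc_hits + 1                              # additively homomorphic update

print("decrypted name:", private_key.decrypt(enc_name).to_bytes(len(name_bytes), "big"))
print("decrypted hit counter:", private_key.decrypt(enc_hits))
```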
Machine Learning based Hydrology
Dr Satish Anamalamudi, Dr Ch Anil Carie, Nandini Mokhamatam., Kolli Lakshmi Varshita., Vishnu Priya Manchikalapudi
Source Title: 2024 International Conference on Computational Intelligence for Green and Sustainable Technologies (ICCIGST), DOI Link
The study follows a structured workflow involving thorough data collection, quality assessment, and visualization through Exploratory Data Analysis (EDA). Addressing challenges like missing values and outliers, the paper employs exploratory data analysis to ensure dataset reliability. Class imbalance is tackled using the Synthetic Minority Over-sampling Technique (SMOTE) and standard scaling for consistent feature normalization. In the modeling phase, classifiers including AdaBoost, Bagging, Gradient Boosting, Decision Tree, Extra Tree, K-Nearest Neighbors, and XGBoost undergo cross-validation. Hyperparameter optimization is performed through grid search for the Gradient Boosting and Bagging classifiers. Results indicate that the XGBClassifier achieves the highest accuracy (0.789634), followed by the GradientBoostingClassifier (0.785061) and the BaggingClassifier (0.786585).
Predicting Red Rot Disease in Sugarcane Leaves Using Deep Learning Techniques
Dr Satish Anamalamudi, Mr Mutupuri Shiva Shankar, Hemanth Perumalla., Sai Nagarjuna Morthala., Sai Harshith Vedanabhatla., Sandeep Reddy Panyala
Source Title: 2024 Third International Conference on Electrical, Electronics, Information and Communication Technologies, DOI Link
Sugarcane is one of the most significant crops grown in India, which is the second-largest producer in the world after Brazil. However, one of the leaf diseases that affects sugarcane yield is Red Rot. It must be detected early in order to effectively combat the disease and protect the crop. In this study, we train customized models using CNN architectures to identify red rot disease. We implement MobileNet, Inception V3, VGG16, ResNet50, and a Hybrid Model, training them on the image dataset with the Adam, SGD (Stochastic Gradient Descent), and RMSProp (Root Mean Square Propagation) optimizers to enhance model performance. We evaluate each model and optimizer pairing extensively through testing and analysis. We built a Hybrid Model combining MobileNet and InceptionV3 with the Adam optimizer, which attained an accuracy of 97.7%. By assisting in the creation of trustworthy techniques for early disease diagnosis in agriculture, our research ultimately aids farmers in safeguarding the yield and well-being of their crops.
Analysing Lossless Image Compression Techniques for IoT Devices
Dr Satish Anamalamudi, Kausik Challapalli., Vivek Kothamasu., Harsha Garikapati., Roop Sai Charan., Manisankar Thota
Source Title: 2024 Third International Conference on Electrical, Electronics, Information and Communication Technologies (ICEEICT), DOI Link
Lossless image compression allows image quality to be kept high while the file size is minimized to meet the needs of uploading or archiving [1]. Deflate trades efficiency for speed: it is fast but achieves lower compression ratios [2]. It may not always produce the smallest files, but it is a dependable choice for preserving image quality and underpins formats like PNG and ZIP [3]. BROTLI, in contrast, shines with extremely high compression ratios when dealing with repeated patterns of information [4], but it is more resource intensive and may need additional computing power as the compression ratio improves [5]. Regardless of the chosen technique, the outcome remains consistent: the compressed file can be restored to the original image without distortion, serving those who need high-quality images at small file sizes [6].
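The Deflate-versus-Brotli trade-off described above can be checked directly: compress the same raw image bytes with both codecs, compare the ratios, and verify losslessness by decompressing. This assumes the standard-library `zlib` module and the third-party `brotli` package; the file name is a placeholder.

```python
import zlib

import brotli   # assumes the Brotli bindings package

with open("sample.bmp", "rb") as f:            # an uncompressed image shows the contrast best
    raw = f.read()

deflated = zlib.compress(raw, 9)               # Deflate at maximum compression level
brotlied = brotli.compress(raw, quality=11)    # Brotli at maximum quality

for name, blob in [("deflate", deflated), ("brotli", brotlied)]:
    print(f"{name}: {len(blob)} bytes, ratio = {len(raw) / len(blob):.2f}x")

# both are lossless: decompression restores the original bytes exactly
assert zlib.decompress(deflated) == raw
assert brotli.decompress(brotlied) == raw
```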
Monitoring and enhancing the co-operation of IoT network through scheduling function based punishment reward strategy
Source Title: PLoS ONE, Quartile: Q1, DOI Link
The Internet of Things (IoT) has revolutionized the connectivity of physical devices, leading to an exponential increase in multimedia wireless traffic and creating substantial demand for radio spectrum. Given the inherent scarcity of available spectrum, Cognitive Radio (CR)-assisted IoT emerges as a promising solution to optimize spectrum utilization through cooperation between cognitive and IoT nodes. Unlicensed IoT nodes can opportunistically access licensed spectrum bands without causing interference to licensed users. However, energy constraints may lead to reduced cooperation from IoT nodes during the search for vacant channels, as they aim to conserve battery life. To address this issue, we propose a Punishment-reward-based Cooperative Sensing and Data Forwarding (PR-CSDF) approach for IoT data transmission. Our method involves two key steps: (1) distributing sensing tasks among IoT nodes and (2) enhancing cooperation through a reward and punishment strategy. Evaluation results demonstrate that both secondary users (SUs) and IoT nodes achieve significant utility gains with the proposed mechanism, providing strong incentives for cooperative behaviour.
Blockchain-Enabled SDN in Resource Constrained Devices
Source Title: Blockchain-based Cyber Security, DOI Link
-
Enhancing Agricultural Decision-Making Through Machine Learning-Based Crop Yield Predictions
Source Title: Lecture Notes in Networks and Systems, Quartile: Q4, DOI Link
Food production through agriculture plays an important role in keeping the world's population hunger-free and nations economically secure. Continuous changes in land minerals, weather conditions, and pesticide usage affect the yield of crops. Farmers can choose successful crops for the season with the help of machine learning algorithms used for crop yield prediction. In this study, we forecasted agricultural production using several kinds of machine learning models while considering factors that affect crop yields, such as rainfall, temperature, and pesticide use. By merging multiple separate model predictions, ensemble machine learning models improve on the performance of individual machine learning models. We have worked with individual models and ensemble models like SVR, RandomForestRegressor, LinearRegressor, and DecisionTreeRegressor to predict crop yield, and found an ensemble solution that combines the strengths of both the stacked generalization model and the gradient boosting algorithm, which can provide improved accuracy and robustness in crop yield prediction. According to the findings, the ensemble solution provided an R2 score of 98 percent, which is higher than the R2 scores of 96 percent obtained using the Decision Tree Regressor and 89 percent obtained using the Gradient Boosting Regressor.
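A short sketch of the ensemble idea described above: a stacked generalization model over the base regressors named in the abstract, with gradient boosting as the final estimator. The CSV name, feature columns, and hyperparameters are placeholders rather than the paper's exact setup.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor, StackingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor

df = pd.read_csv("crop_yield.csv")                                  # placeholder dataset
X = df[["rainfall_mm", "avg_temp_c", "pesticides_tonnes"]]          # illustrative features
y = df["yield_hg_per_ha"]                                           # illustrative target

stack = StackingRegressor(
    estimators=[("svr", SVR()),
                ("rf", RandomForestRegressor(n_estimators=200)),
                ("dt", DecisionTreeRegressor(max_depth=8)),
                ("lin", LinearRegression())],
    final_estimator=GradientBoostingRegressor())                    # meta-learner on base predictions

print("stacked ensemble R^2:", cross_val_score(stack, X, y, cv=5, scoring="r2").mean())
```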
IoT Task Offloading in Edge Computing Using Non-Cooperative Game Theory for Healthcare Systems
Source Title: CMES - Computer Modeling in Engineering and Sciences, Quartile: Q2, DOI Link
We present a comprehensive system model for Industrial Internet of Things (IIoT) networks empowered by Non-Orthogonal Multiple Access (NOMA) and Mobile Edge Computing (MEC) technologies. The network comprises essential components such as base stations, edge servers, and numerous IIoT devices characterized by limited energy and computing capacities. The central challenge addressed is the optimization of resource allocation and task distribution while adhering to stringent queueing delay constraints and minimizing overall energy consumption. The system operates in discrete time slots and employs a quasi-static approach, with a specific focus on the complexities of task partitioning and the management of constrained resources within the IIoT context. This study makes valuable contributions to the field by enhancing the understanding of resource-efficient management and task allocation, particularly relevant in real-time industrial applications. Experimental results indicate that our proposed algorithm significantly outperforms existing approaches, reducing queue backlog by 45.32% and 17.25% compared to SMRA and ACRA while achieving a 27.31% and 74.12% improvement in Q. Moreover, the algorithm effectively balances complexity and network performance, as demonstrated when reducing the number of devices in each group (N) from 200 to 50, resulting in a 97.21% reduction in complexity with only a 7.35% increase in energy consumption. This research offers a practical solution for optimizing IIoT networks in real-time industrial settings.
Node Significance Analysis in Complex Networks Using Machine Learning and Centrality Measures
Source Title: IEEE Access, Quartile: Q1, DOI Link
The study addresses the limitations of traditional centrality measures in complex networks, especially in disease-spreading situations, due to their inability to fully grasp the intricate connection between a node's functional importance and structural attributes. To tackle this issue, the research introduces an innovative framework that employs machine learning techniques to evaluate the significance of nodes in transmission scenarios. This framework incorporates various centrality measures like degree, clustering coefficient, Katz, local relative change in average clustering coefficient, average Katz, and average degree (LRACC, LRAK, and LRAD) to create a feature vector for each node. These methods capture diverse topological structures of nodes and incorporate the infection rate, a critical factor in understanding propagation scenarios. To establish accurate labels for node significance, propagation tests are simulated using epidemic models (SIR and Independent Cascade models). Machine learning methods are employed to capture the complex relationship between a node's true spreadability and infection rate. The performance of the machine learning model is compared to traditional centrality methods in two scenarios. In the first scenario, training and testing data are sourced from the same network, highlighting the superior accuracy of the machine learning approach. In the second scenario, training data from one network and testing data from another are used, where LRACC, LRAK, and LRAD outperform the machine learning methods.
Performance Evaluation of Machine Learning and Neural Network Algorithms for Wine Quality Prediction
Dr Murali Krishna Enduri, Dr Satish Anamalamudi, Ms Tokala Srilatha, Harika Kakarala., Asish Karthikeya Gogineni., Thadi Venkata Satya Murty
Source Title: 2023 14th International Conference on Computing Communication and Networking Technologies (ICCCNT), DOI Link
The assessment of wine quality is of paramount importance to both consumers and the wine industry. Recognizing its impact on customer satisfaction and business success, companies are increasingly turning to product quality certification to enhance sales in the global beverage market. Traditionally, quality testing was conducted towards the end of the manufacturing process, resulting in time-consuming and resource-intensive procedures. This approach involved the engagement of multiple human experts to evaluate wine quality, leading to high costs. Moreover, since taste perception is subjective and varies among individuals, relying solely on human specialists for assessing wine quality presents significant challenges. Our research focuses on advancing the quality of wine prediction by leveraging diverse characteristics of wine. We applied various feature selection techniques and explored machine learning algorithms to identify the optimal combination of parameters for accurate wine quality prediction. This approach reduces the time and costs associated with traditional quality assessment methods and provides a more standardized and consistent evaluation process. Our findings contribute to the advancement of wine industry practices, enabling businesses to make informed decisions and deliver high-quality products that meet consumer expectations.
A Comparison of Neural Networks and Machine Learning Methods for Prediction of Heart Disease
Source Title: 2023 3rd International Conference on Intelligent Communication and Computational Techniques, DOI Link
Heart disease is a major cause of death and disability across the world. Heart disease mortality and morbidity rates can be greatly decreased with early detection and treatment. Hence, the development of efficient and accurate methods for early diagnosis of heart disease has become a priority in the medical field. In this study, we conducted a comparative study of existing supervised machine learning approaches for predicting heart disease diagnosis and also improved the accuracy of KNN by changing K values. We used a dataset that consists of a variety of features such as age, gender and other important indicators for heart disease diagnosis. We then explored and evaluated traditional ML algorithms such as logistic regression, decision tree, random forest and SVM for the predictive analysis. A number of criteria, including accuracy, precision, recall, and F1 Score, were used to assess the models' performance. This study provides evidence that ML algorithms can be used to forecast the diagnosis of heart disease. Healthcare providers and medical practitioners can utilize the outcomes of this study for early detection and management of cardiac disease. Further research will aim to analyse and evaluate additional machine learning algorithms to enhance precision and performance.
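The K-tuning step mentioned above can be sketched as a cross-validated sweep over K. The CSV name and `target` column are placeholders for the heart-disease dataset used in the study, and the K range is an assumption.

```python
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("heart.csv")                              # placeholder dataset
X, y = df.drop(columns=["target"]), df["target"]

best_k, best_acc = None, 0.0
for k in range(1, 31, 2):                                  # odd K avoids ties in binary voting
    pipe = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=k))
    acc = cross_val_score(pipe, X, y, cv=5, scoring="accuracy").mean()
    if acc > best_acc:
        best_k, best_acc = k, acc

print(f"best K = {best_k} with cross-validated accuracy {best_acc:.3f}")
```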
Redundant Transmission Control Algorithm for Information-Centric Vehicular IoT Networks
Dr Satish Anamalamudi, Dr Murali Krishna Enduri, Abdur Rashid Sangi., Mohammed S Alkatheiri., Chettupally Anil Carie., Mohammed A Alqarni
Source Title: Computers, Materials and Continua, Quartile: Q1, DOI Link
Vehicular Adhoc Networks (VANETs) enable vehicles to act as mobile nodes that can fetch, share, and disseminate information about vehicle safety, emergency events, warning messages, and passenger infotainment. However, the continuous dissemination of information from vehicles and their one-hop neighbor nodes, Road Side Units (RSUs), and VANET infrastructures can lead to performance degradation of VANETs in the existing host-centric IP-based network. Therefore, Information Centric Networks (ICN) are being explored as an alternative architecture for vehicular communication to achieve robust content distribution in highly mobile, dynamic, and error-prone domains. In ICN-based Vehicular-IoT networks, consumer mobility is implicitly supported, but producer mobility may result in redundant data transmission and caching inefficiency at intermediate vehicular nodes. This paper proposes an efficient redundant transmission control algorithm based on network coding to reduce data redundancy and accelerate the efficiency of information dissemination. The proposed protocol, called Network Coding Multiple Solutions Scheduling (NCMSS), is a receiver-driven collaborative scheduling scheme between requesters and information sources that uses a global parameter expectation deadline to effectively manage the transmission of encoded data packets and control the selection of information sources. Experimental results for the proposed NCMSS protocol are presented to analyze the performance of ICN-vehicular-IoT networks in terms of caching, data retrieval delay, and end-to-end application throughput. The end-to-end throughput in the proposed NCMSS is 22% higher (for 1024-byte data) than existing solutions, whereas delay in NCMSS is reduced by 5% in comparison with existing solutions.
AI based scheduling protocol for Cognitive Radio Adhoc Networks
Dr Satish Anamalamudi, Dr Sobin C C, Abdur Rashid Sangi., Lokeshwari Anamalamudi., Mohammed S Alkatheiri., Reddypriya Madupuri
Source Title: 2023 IEEE 4th International Conference on Pattern Recognition and Machine Learning (PRML), DOI Link
Cognitive Radio Ad hoc Networks (CRAHNs) have emerged as a promising solution to address the spectrum scarcity problem by allowing unlicensed secondary users to opportunistically access underutilized spectrum bands. However, efficient and dynamic spectrum access in CRAHNs remains a challenging task due to the dynamic nature of the network and the unpredictable spectrum availability. This paper proposes an AI-based scheduling protocol for Cognitive Radio Ad hoc Networks (AI-SCAN) to address the spectrum access problem in CRAHNs. The protocol utilizes machine learning techniques to enable intelligent decision-making for spectrum allocation and scheduling. Simulations are conducted to evaluate the performance of AI-SCAN in comparison to existing scheduling protocols. The results demonstrate that AI-SCAN achieves superior performance in terms of spectrum utilization, network throughput, and fairness among secondary users. The protocol effectively balances the trade-off between maximizing spectrum utilization and minimizing interference, thereby enhancing the overall efficiency and reliability of CRAHNs.
Hybrid Deep Learning Approach for Information Analysis and Fake News Detection
Dr Satish Anamalamudi, Dr Sobin C C, Krishna Kishore Buddi., Lokeshwari Anamalamudi., Shiva Shankar Mutupuri., Reddypriya Madupuri
Source Title: 2023 IEEE 15th International Conference on Computational Intelligence and Communication Networks (CICN), DOI Link
The era of digital information has witnessed the alarming surge of disinformation, posing a serious challenge to the reliability of information and societal cohesion. In response, the need for robust and efficient methods to identify false information has become paramount. This paper introduces an innovative Hybrid Deep Learning approach that harnesses the capabilities of deep neural networks to improve the precision and dependability of information analysis within information systems. This approach combines Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs), with a particular emphasis on the examination of textual and multimedia content. By integrating both CNNs and RNNs, this method adeptly captures spatial and temporal features, enabling the evaluation of textual and visual content from diverse sources. Moreover, the approach incorporates attention mechanisms to assess the relevance of different elements within the content, facilitating the fine-grained differentiation between authentic and deceptive information. A comparative examination of several methodologies has been carried out. The results exhibit a substantial enhancement in the precision of disinformation identification when compared to conventional machine learning methods and standalone deep learning models.
Machine Learning based Malware Detection for IoT Networks
Dr Satish Anamalamudi, Mr Mutupuri Shiva Shankar, Vallabhaneni Preetam., Dudugu Aditya., Alapati Harishitha., Sruthi Sivarajan
Source Title: 2023 IEEE 15th International Conference on Computational Intelligence and Communication Networks (CICN), DOI Link
The advent of the Internet of Things (IoT) has instigated significant transformations within the healthcare sector. It has paved the way for cutting-edge healthcare monitoring systems. To ensure the dependability and effectiveness of these systems, it becomes imperative to engage in performance modeling and analysis. This research centers on the assessment of IoT-enabled healthcare monitoring system performance, taking into consideration diverse factors including IoT network infrastructure, communication protocols, data processing, and storage. Furthermore, the study employs machine learning techniques to appraise healthcare data collected through IoT networks. Through simulation-based methodologies, this investigation models the behavior of the system and evaluates its performance using metrics like response time, throughput, and reliability. The discoveries from this study furnish invaluable insights into the performance of the system, pinpointing areas where enhancements can elevate the overall efficiency of healthcare monitoring systems. Consequently, this research makes substantial contributions to the enhancement of efficient IoT-enabled healthcare monitoring systems, which in turn offer dependable and cost-effective healthcare solutions.
Advancements in Sentiment Analysis: A Deep Learning Approach
Dr Satish Anamalamudi, Dr Murali Krishna Enduri, Mr Koduru Hajarathaiah, Yogeshvar Reddy Kallam., Lovely Yeswanth Panchumarthi., Lavanya Parchuri
Source Title: 2023 IEEE 15th International Conference on Computational Intelligence and Communication Networks (CICN), DOI Link
Sentiment analysis, a pivotal discipline in the digital era, revolves around the nuanced task of categorizing user sentiments within textual data. This research embarks on an exhaustive exploration of diverse sentiment analysis models, comprising Convolutional Neural Networks (CNNs), Long Short-Term Memory Networks (LSTMs), Support Vector Machines (SVMs), and a Baseline Model. Through a rigorous comparative analysis of their performance across varied datasets, this study illuminates the unique strengths and limitations inherent to each model. Furthermore, the research extends beyond the realm of academic inquiry to unveil the practical applications of sentiment analysis. It underscores the profound impact of sentiment analysis in contemporary data-driven decision-making, illustrating its significance across multifaceted domains such as marketing, social media monitoring, finance, customer service, and public sentiment analysis. This investigation seeks to empower stakeholders with invaluable insights, thereby facilitating informed choices and strategies in the ever-evolving digital landscape.
Performance Modeling and Analysis of Internet of Things Enabled Healthcare Monitoring Systems
Dr Satish Anamalamudi, Mr Mutupuri Shiva Shankar, Ms Sruthi S, Saithej Singhu., Lokeshwari Anamalamudi., Shiva Shankar Mutupuri., Chettupally Anil Carie
Source Title: 2023 IEEE 15th International Conference on Computational Intelligence and Communication Networks (CICN), DOI Link
The rise of the Internet of Things (IoT) has brought about significant changes in the healthcare sector, leading to the development of more advanced healthcare monitoring systems. To ensure that these systems are dependable and effective, it is crucial to conduct performance modeling and analysis. This research is focused on evaluating the performance of healthcare monitoring systems enabled by IoT. It takes into account various factors, including IoT network infrastructure, communication protocols, data processing, and storage. Moreover, the study utilizes a machine learning technique, namely an Artificial Neural Network, to assess healthcare data collected through IoT networks. By employing simulation-based methods, this investigation models how the system behaves and evaluates its performance using metrics like response time, throughput, and reliability. The findings from this study provide valuable insights into the performance of these systems, pinpointing areas where improvements can enhance the overall efficiency of healthcare monitoring systems. As a result, this research makes a substantial contribution to the improvement of efficient IoT-enabled healthcare monitoring systems, which, in turn, offer dependable and cost-effective healthcare solutions.
Tackling Disinformation: Machine Learning Solutions for Fake News Detection
Dr Satish Anamalamudi, Sangi A R., Nagaram J., Sudulagunta A., Talari S S., Malla V., Enduri M
Source Title: ACM International Conference Proceeding Series, Quartile: Q3, DOI Link
View abstract ⏷
Fake news is news that spreads rapidly via the internet but is not true, i.e., false news. Because modern audiences are easily drawn to trends, some actors exploit this for profit by attracting clicks to such fake news. These issues can be observed in areas such as politics, medicine, and job scams. The tremendous increase in the spread of fake news can erode trust in real news. The main goal is to create a resilient and effective system with the ability to automatically differentiate between authentic and falsified news articles. Machine learning algorithms offer a systematic way to determine whether news is fake or real. We selected several datasets and applied machine learning algorithms including Decision Tree, Naive Bayes, SVM, Random Forest, Logistic Regression, and the Passive Aggressive classifier, and identified the algorithm with the highest performance measures. The experimental results exhibit encouraging performance in identifying fake news, highlighting machine learning's potential in countering misinformation. The outcomes also imply that blending various feature types and advanced algorithms leads to better performance than individual methods. Applying these ML methods on two datasets, we achieved accuracies of 99.69% with SVM, 99.06% with Logistic Regression, and 99.64%.
Improving Skin Disease Diagnosis with Deep Learning: A Comprehensive Evaluation
Source Title: 2023 14th International Conference on Computing Communication and Networking Technologies (ICCCNT), DOI Link
View abstract ⏷
Skin disease is one of the deadliest maladies, yet diagnosing dermatological problems precisely can be challenging. In the present study, deep learning is used to identify such skin conditions. Such a tool would be more efficient than manual procedures, which are time-consuming and call for expert help. This paper's main goal is to provide an in-depth conceptual review of current advancements in deep learning-based skin disease identification. Although deep learning is gaining popularity, many problems remain to be solved and much more study is needed. Current manual procedures for diagnosing skin diseases are known to be time-consuming because they rely on professional judgement.
A novel approach to minimize the Black Hole attacks in Vehicular IoT Networks
Source Title: ACM International Conference Proceeding Series, Quartile: Q3, DOI Link
View abstract ⏷
Vehicular Ad-hoc IoT Networks (VA-IoT) have gained significant attention due to their ability to enable distributed vehicle-to-vehicle data transmission. However, VA-IoT networks are susceptible to various security threats, including the Black Hole attack, in which an intruder or malicious node attracts network traffic by broadcasting fake route messages and then drops all received packets, significantly degrading network performance. To mitigate this, the paper presents a new mechanism to minimize Black Hole attacks in VANETs by combining two techniques: a trust management system and an intrusion detection system. The proposed approach assigns trust values to each vehicle based on its past behavior and routes packets only through trusted nodes. Additionally, an intrusion detection system identifies malicious nodes that violate the trust threshold and takes appropriate measures. The proposed approach outperforms existing schemes in terms of achievable end-to-end throughput and reduced network delay.
An AI fuzzy clustering-based routing protocol for vehicular image recognition in vehicular ad hoc IoT networks
Source Title: Soft Computing, Quartile: Q1, DOI Link
View abstract ⏷
A vehicular ad hoc IoT network (VA-IoT) plays a key role in exchanging constrained networked vehicle information through IPv6-enabled sensor nodes. It is noteworthy that vehicular IoT is the interconnection of vehicular ad hoc networks with the support of constrained IoT devices. Routing protocols in VA-IoT are designed to route vehicular traffic in distributed environments. In addition, VA-IoT is designed to enhance road safety by reducing the number of road accidents through reliable data transmission. Routing in VA-IoT faces a uniquely dynamic topology, frequent spectrum and node handover, and restricted versatility. Hence, it is crucial to design hybrid reactive routing protocols that ensure the network throughput and data reliability of VA-IoT networks. This paper proposes an AI-based reactive routing protocol to enhance network throughput and minimize end-to-end delay with respect to node mobility, spectrum mobility, link traffic load, and end-to-end network traffic load while transmitting vehicular images. In addition, the image transmission time of the proposed routing protocol is compared with that of existing proactive and reactive routing protocols in vehicular ad hoc IoT (VA-IoT) networks.
Liver Disease Prediction and Classification using Machine Learning Techniques
Dr Satish Anamalamudi, Dr Murali Krishna Enduri, Ms Tokala Srilatha, Mr Koduru Hajarathaiah, Sai Ram Praneeth Gunda., Botla Srinivasrao., Nalluri Lakshmikanth., Nagamanohar Pathipati
Source Title: International Journal of Advanced Computer Science and Applications, Quartile: Q3, DOI Link
View abstract ⏷
Liver diseases have recently become one of the most lethal disorders in a number of countries. The count of patients with liver disorders has been rising because of alcohol intake, inhalation of harmful gases, and consumption of spoiled food and drugs. Liver patient datasets are being studied for the purpose of developing classification models that predict liver disorders; such models can reduce the workload on doctors. In this work, we propose to apply machine learning algorithms to screen patients for liver disorders. Chronic liver disorder is defined as a liver disorder that lasts for at least six months. Accordingly, we use the percentage of patients who contract the disease as both positive and negative information. We process liver disease data with several classifiers and display the results as confusion matrices. We propose several classification schemes that can effectively improve classification performance when a training dataset is available; a machine learning classifier then separates good and bad values. The outputs of the proposed classification model show good accuracy in predicting the result.
Algorithms for Finding Influential People with Mixed Centrality in Social Networks
Source Title: Arabian Journal for Science and Engineering, Quartile: Q1, DOI Link
View abstract ⏷
Identifying the seed nodes in networks is an important task for understanding the dynamics of information diffusion. It has many applications, such as energy usage/consumption, rumor control, viral marketing, and opinion monitoring. Compared to other nodes, seed nodes have the greatest potential to spread information in most networks. To identify seed nodes, researchers have introduced centrality measures based on network structure. Centrality measures based on local structure include degree, semi-local, and PageRank centralities; measures based on global structure include betweenness, closeness, and eigenvector centralities. Very few centrality measures are based on both the local and global structure of the network. We define mixed centrality measures based on the local and global structure of the network. We propose a measure based on degree, the shortest path between vertices, and any global centrality, and we generalize the definition so that any measure defined on a network's global structure can be used. Using this mixed centrality, we identify the seed nodes of various real-world networks and show that it gives good results compared with existing basic centrality measures. We also tune different real-world parameters to study their effect on the maximum influence.
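The sketch below illustrates one plausible way to combine a local degree term, shortest-path proximity, and a pluggable global centrality into a single score, in the spirit of the mixed centrality described above; the specific combination, the use of networkx, and closeness as the default global component are assumptions for illustration, not the paper's exact formula.

```python
# Illustrative sketch only: combine normalized degree (local), average
# shortest-path proximity, and a pluggable global centrality into one score.
import networkx as nx

def mixed_centrality(G, global_measure=nx.closeness_centrality):
    """Return a dict of assumed 'mixed' scores per node."""
    n = G.number_of_nodes()
    deg = {v: d / (n - 1) for v, d in G.degree()}        # local component
    glob = global_measure(G)                              # pluggable global component
    mixed = {}
    for v in G.nodes():
        lengths = nx.single_source_shortest_path_length(G, v)
        avg_sp = sum(lengths.values()) / max(len(lengths) - 1, 1)
        proximity = 1.0 / avg_sp if avg_sp > 0 else 0.0   # shorter paths -> larger score
        mixed[v] = deg[v] * proximity * (1.0 + glob[v])
    return mixed

if __name__ == "__main__":
    G = nx.karate_club_graph()
    scores = mixed_centrality(G)
    print(sorted(scores, key=scores.get, reverse=True)[:5])  # candidate seed nodes
```

Any global measure (betweenness, eigenvector, etc.) can be passed in place of closeness, mirroring the generalization the abstract describes.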
Integration of E-health and Internet of Things
Source Title: Blockchain Technology Solutions for the Security of IoT-Based Healthcare Systems, DOI Link
View abstract ⏷
The proliferation of healthcare-specific Internet of Things (IoT) devices opens up huge opportunities in automated healthcare management systems. Integrating the healthcare system with IoT networks is crucial for time-critical, sensitive applications. State-of-the-art IoT networks transmit application data through non-deterministic best-effort traffic flows, where data from different nodes is scheduled on a single shared channel. In contrast, data from healthcare systems needs to be transmitted in predetermined, per-flow deterministic traffic flows to guarantee quality of service (QoS) in terms of transmission delay and packet drops. To achieve this, the current IoT protocol stack needs to be updated to support deterministic traffic flows that ensure guaranteed QoS in healthcare and medical applications. Hence, this chapter proposes protocol aspects (scheduling and routing) for integrating E-health with IoT networks so as to ensure predetermined traffic flows with predictable end-to-end delays.
Generalization of Relative Change in a Centrality Measure to Identify Vital Nodes in Complex Networks
Source Title: IEEE Access, Quartile: Q1, DOI Link
View abstract ⏷
Identifying vital nodes is important in disease research, spreading rumors, viral marketing, and drug development. The vital nodes in any network are used to spread information as widely as possible. Centrality measures such as Degree centrality (D), Betweenness centrality (B), Closeness centrality (C), Katz (K), Cluster coefficient (CC), PR (PageRank), LGC (Local and Global Centrality), and ISC (Isolating Centrality) can be used to effectively quantify vital nodes. The majority of these centrality measures are defined in the literature based on a network's local and/or global structure. However, these measures are time-consuming and inefficient for large-scale networks, and they cannot study the effect of removing vital nodes in resource-constrained networks. To address these concerns, we propose six new centrality measures, namely GRACC, LRACC, GRAD, LRAD, GRAK, and LRAK. We develop these measures based on the relative change of the clustering coefficient, degree, and Katz centralities after the removal of a vertex. Next, we compare the proposed centrality measures with D, B, C, CC, K, PR, LGC, and ISC to demonstrate their efficiency and time complexity. We utilize the SIR (Susceptible-Infected-Recovered) and IC (Independent Cascade) models to study the maximum information spread of the proposed measures over conventional ones. We perform extensive simulations on large-scale real-world data sets and show that local centrality measures perform better than global measures in some networks in terms of time complexity and information spread. Further, we also observe that the number of cliques drastically improves the efficiency of global centrality measures.
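As a rough illustration of the relative-change idea underlying these measures, the sketch below scores a vertex by the relative change in average degree of the whole graph (a global variant) or of its ego network (a local variant) after the vertex is removed. These simplified formulas are assumptions for illustration and are not the paper's exact GRAD/LRAD/GRAK definitions.

```python
# Minimal sketch (assumed simplification): relative change of average degree
# after removing a vertex, computed globally or on the vertex's ego network.
import networkx as nx

def avg_degree(G):
    n = G.number_of_nodes()
    return 2.0 * G.number_of_edges() / n if n else 0.0

def global_relative_change_degree(G, v):
    before = avg_degree(G)
    H = G.copy()
    H.remove_node(v)
    return abs(before - avg_degree(H)) / before if before else 0.0

def local_relative_change_degree(G, v):
    ego = nx.ego_graph(G, v)                 # v plus its direct neighbors
    before = avg_degree(ego)
    ego.remove_node(v)
    return abs(before - avg_degree(ego)) / before if before else 0.0

if __name__ == "__main__":
    G = nx.karate_club_graph()
    scores = {v: local_relative_change_degree(G, v) for v in G.nodes()}
    print(sorted(scores, key=scores.get, reverse=True)[:5])
```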
Comparative study on sentimental analysis using machine learning techniques
Dr Murali Krishna Enduri, Dr Satish Anamalamudi, Abdur Rashid Sangi., Ramanadham Chandu Badrinath Manikanta., Kallam Yogeshvar Reddy., Panchumarthi Lovely Yeswanth., Suda Kiran Sai Reddy., Gogineni Asish Karthikeya
Source Title: Mehran University Research Journal of Engineering and Technology, DOI Link
View abstract ⏷
-
Deep Learning Approaches for Detecting Psychological Instability: An Evaluation of Performance
Source Title: 2023 14th International Conference on Computing Communication and Networking Technologies (ICCCNT), DOI Link
View abstract ⏷
These days, people all around the world put in a lot of effort to keep up with a rapidly growing world. As a result, many have to cope with a variety of health problems, the most well-known of which is depression or stress, which can ultimately result in death or other harmful actions. Such irregularities can be referred to as psychological instability, which can be treated through therapy advised by medical professionals. For this research, a dataset was taken from the internet and processed using neural networks (NN); a few machine learning models were also used to check accuracy. In addition, an interface was created consisting of a variety of questions related to the dataset considered in this work. For the instability identification process, fuzzy techniques are used: the answers to the questions are designed accordingly, and based on the responses received, the level of instability of a person is determined so that the person can receive the required treatment.
Augmenting data security: physical unclonable functions for linear canonical transform based cryptography
Source Title: Applied Physics B: Lasers and Optics, Quartile: Q2, DOI Link
View abstract ⏷
Linear Canonical Transform (LCT)-based optical encryption systems are vulnerable, primarily because of the predictable nature of the security keys (i.e., simulated random keys) used in the encryption process. To alleviate this, in this work we present a Physically Unclonable Function (PUF) for producing a robust encryption key for digital implementations of any optical encoding system. A correlation function of scattered perfect optical vortex (POV) beams is utilized to generate the encryption keys. To the best of our knowledge, this is the first report on properly utilizing a scattered POV for optical encryption systems. To validate the generated keys, the standard Linear Canonical Transform-based Double Random Phase Encoding (LCT-DRPE) technique is used. Experimental and simulation results validate the proposed key generation method as an effective alternative to digital encryption keys.
Find the Spreading Ability of the Influential Nodes using the IC Model in Social Networks
Source Title: 2022 14th International Conference on Computational Intelligence and Communication Networks, DOI Link
View abstract ⏷
In a world of fast-growing technology, social media and networks have reached a point where they influence a large percentage of the population in their respective regions or languages. In today's world, people are influenced by popular public figures. In this research, we identify the most influential people in social networks so that information can be shared easily, which helps in different ways such as marketing, stopping the spread of false information, and disseminating cautions or hazard warnings. This makes it possible to spread information to large groups of people with comparatively little capital. Finding influential people reduces to finding influential nodes in social networks. Researchers around the world have proposed various ways of finding influential nodes, such as PageRank, degree centrality, betweenness centrality, and closeness centrality; some are global-structure-based and some are local-structure-based. Our idea is to apply an independent cascade model to the basic centralities to test their spreading ability, and to analyze the relationship between centrality values and information spread. Finally, we discuss various centralities that help in finding influential nodes and pick the best centrality depending on the cause or situation.
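A minimal sketch of how spreading ability can be estimated for centrality-selected seeds under the Independent Cascade model is given below; the activation probability, the number of Monte Carlo runs, and the use of degree centrality for seed selection are illustrative assumptions rather than the paper's settings.

```python
# Illustrative sketch: estimate the average spread of a seed set under the
# Independent Cascade (IC) model via Monte Carlo simulation.
import random
import networkx as nx

def independent_cascade(G, seeds, p=0.1, runs=200):
    """Average number of activated nodes over Monte Carlo runs."""
    total = 0
    for _ in range(runs):
        active = set(seeds)
        frontier = list(seeds)
        while frontier:
            nxt = []
            for u in frontier:
                for v in G.neighbors(u):
                    if v not in active and random.random() < p:
                        active.add(v)
                        nxt.append(v)
            frontier = nxt
        total += len(active)
    return total / runs

if __name__ == "__main__":
    G = nx.karate_club_graph()
    deg = nx.degree_centrality(G)
    seeds = sorted(deg, key=deg.get, reverse=True)[:3]   # top-3 nodes by degree centrality
    print("estimated spread:", independent_cascade(G, seeds))
```

Swapping in betweenness, closeness, or PageRank for the seed-selection step lets the same simulation compare the spreading ability of different centralities, as the abstract describes.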
An Empirical Study on Fake News Prediction with Machine Learning Methods
Source Title: 2022 14th International Conference on Computational Intelligence and Communication Networks, DOI Link
View abstract ⏷
Due to advances in technology and distributed networking, a huge amount of information is available on the internet. Consequently, some users may post fake news through certain platforms to gain financial benefit. A common user finds it difficult to differentiate fake news from authentic news, so fake news can become a weapon against a particular individual, society, organization, or even a political party. To date, a lot of research has been done to detect fake news on the internet, but most solutions are evaluated with very few performance metrics and limited datasets. In this work, we propose to use Decision Tree, SVM, LSTM, and Naive Bayes techniques to analyze and observe their behavior on different datasets. Furthermore, we compare the methods and demonstrate the best approach through experimental analysis.
Air Quality Analysis and Forecasting Using Deep Learning
Source Title: 2022 International Conference on Computational Intelligence and Sustainable Engineering Solutions, DOI Link
View abstract ⏷
In today's world, people are increasingly concerned about air quality. Since air is everywhere, we cannot escape its pollutants, and to keep our health safe we need to maintain a certain air quality in our surroundings. Effective air quality prediction is an active research topic. This paper discusses some of the challenges faced due to a lack of data resources, varying pollutant concentrations, and related issues, and proposes a solution based on predictive feature extraction for forecasting air quality. The model is based on LightGBM and predicts the PM2.5 concentration at 35 air monitoring stations in Beijing for the upcoming 24 hours. The high-dimensional, large-scale data is also processed with CNN, KNN, and random forest algorithms to predict air quality. As new data is explored, the spatial information retained in the existing model can be reused, improving predictive accuracy and efficiency. Using a sliding-window mechanism, deeper high-dimensional data can be mined, increasing the amount of information available. We compare the predicted values with the actual values from the provided dataset and show, through high-dimensional statistical analysis, that the proposed model outperforms existing models and yields effective results for air quality management.
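The sketch below shows the general sliding-window forecasting pattern described above with LightGBM, using a synthetic PM2.5 series and an assumed 24-hour lag window; it is not the paper's exact feature set, dataset, or model configuration.

```python
# Illustrative sketch only: lag features from a sliding window feed a
# LightGBM regressor that predicts the next-hour PM2.5 value.
import numpy as np
import pandas as pd
import lightgbm as lgb

# Synthetic hourly PM2.5 series standing in for real monitoring data.
rng = np.random.default_rng(0)
pm25 = pd.Series(50 + 10 * np.sin(np.arange(2000) / 24) + rng.normal(0, 5, 2000))

window = 24                                              # assumed 24-hour sliding window
X = np.column_stack([pm25.shift(i) for i in range(1, window + 1)])[window:]
y = pm25.values[window:]

split = int(0.8 * len(y))
model = lgb.LGBMRegressor(n_estimators=200, learning_rate=0.05)
model.fit(X[:split], y[:split])
pred = model.predict(X[split:])
print("MAE:", np.mean(np.abs(pred - y[split:])))
```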
A Tool for Fake News Detection using Machine Learning Techniques
Source Title: 2022 2nd International Conference on Intelligent Technologies (CONIT), DOI Link
View abstract ⏷
The web and internet are important to a very large number of people and have a huge user base. Users rely on many social media platforms for different purposes, and any user can post or spread news and messages through these online platforms. Even though the algorithms used by social media platforms are updated meticulously, they are still not efficient enough to filter out fake news, or to make essential information viral first in the region where it is needed so that people living there benefit before the news reaches the rest of the world. One of the biggest contributors to fake news is social bots, which generate content automatically and spread it among social media users. In this work, we propose an effective approach to detect fake news and false information using machine learning techniques. We provide a tool that detects fake news using the Naive Bayes technique with high accuracy, and we show results on two datasets using this tool.
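A minimal sketch of the kind of pipeline such a tool could use, TF-IDF features feeding a Naive Bayes classifier, is shown below; the dataset path and column names are placeholders, not the datasets used in the paper.

```python
# Minimal sketch: TF-IDF text features with a Multinomial Naive Bayes
# classifier for fake/real news labels. "news.csv", "text", and "label"
# are placeholder names.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

df = pd.read_csv("news.csv")                       # placeholder dataset path
X_train, X_test, y_train, y_test = train_test_split(
    df["text"], df["label"], test_size=0.2, random_state=42)

model = make_pipeline(TfidfVectorizer(stop_words="english"), MultinomialNB())
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```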
Computing Influential Nodes Using the Nearest Neighborhood Trust Value and PageRank in Complex Networks
Source Title: Entropy, Quartile: Q1, DOI Link
View abstract ⏷
Computing influential nodes gets a lot of attention from many researchers for information spreading in complex networks. It has vast applications, such as viral marketing, social leader creation, rumor control, and opinion monitoring. The information-spreading ability of influential nodes is greater compared with other nodes in the network. Several researchers proposed centrality measures to compute the influential nodes in a complex network, such as degree, betweenness, closeness, semi-local centralities, and PageRank. These centrality methods are defined based on the local and/or global information of nodes in the network. However, due to their high time complexity, centrality measures based on the global information of nodes have become unsuitable for large-scale networks. Very few centrality measures exist that are based on the attributes between nodes and the structure of the network. We propose the nearest neighborhood trust PageRank (NTPR) based on the structural attributes of neighbors and nearest neighbors of nodes. We define the measure based on the degree ratio, the similarity between nodes, the trust values of neighbors, and the nearest neighbors. We computed the influential nodes in various real-world networks using the proposed centrality method. We found the maximum influence by using influential nodes with SIR and independent cascade methods. We also compare the maximum influence of our centrality measure with the existing basic centrality measures.
Automated Resume Screener using Natural Language Processing(NLP)
Source Title: 2022 6th International Conference on Trends in Electronics and Informatics, DOI Link
View abstract ⏷
Resume screening is the process of evaluating job seekers' resumes against a specific requirement. It is used to determine a candidate's eligibility for a job by matching the requirements of the offered role with resume information such as educational qualifications, skill sets, and technical expertise. Resume screening is a crucial stage in candidate selection: it is where the decision is made whether to move the candidate to the next level of the hiring process. Traditionally, this process is performed manually, but companies often receive thousands of resumes per job posting. To reduce human involvement and errors, many new approaches have been introduced. This paper discusses one such approach, which performs resume screening efficiently using Natural Language Processing (NLP) and an automated machine learning algorithm. The paper explains the end-to-end working of a Python application that efficiently screens candidates' resumes based on the organization's requirements.
A Comparative Study on Machine Learning based Prediction of Citations of Articles
Source Title: 2022 6th International Conference on Trends in Electronics and Informatics, DOI Link
View abstract ⏷
Authors can use predictions to create accurate estimates of the likely outcome of a query based on past data, whether about customer churn or possible fraudulent conduct. The citation count indicates the number of times a publication has been cited. One of the most important considerations for an author when publishing an article is how to make a significant impact, since the impact of a paper is broad and creates opportunities for fresh ideas and progress. Future citation counts are useful for researchers selecting representative literature because they are an important indicator of the potential influence of published papers; predicting them is a regression problem. Predicting and understanding article citation counts, however, is a difficult problem both theoretically and empirically, as decades of research have shown. Here, the influence of each work is predicted from its previous citations, with the goal of precisely anticipating the number of citations it will receive over time. The study also provides a comparative analysis of machine learning methods for predicting article citations.
Efficient algorithm for finding the influential nodes using local relative change of average shortest path
Source Title: Physica A: Statistical Mechanics and its Applications, Quartile: Q1, DOI Link
View abstract ⏷
In complex networks, finding influential nodes plays a crucial role from both theoretical and practical points of view because such nodes are capable of propagating information to a large portion of the network. Investigating the dynamics of information spreading in complex networks is a hot topic with a wide range of applications, including information dissemination, information propagation, rumour control, viral marketing, and opinion monitoring. In recent years, several centrality measures have been proposed to find influential nodes in complex networks. In this work, the local relative change of average shortest path (local RASP), based on the local structure of the network, is proposed. The local RASP of a node is defined based on the relative change in the average shortest path of the node's local network when the node is deleted. Our local RASP centrality produces good results compared to degree, betweenness, closeness, semi-local, PageRank, Trust-PageRank, and RASP centralities, and its computation time is lower than that of the global RASP measure. It efficiently measures information diffusion within the network through the initial seed nodes it identifies.
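The sketch below gives one simplified reading of the local RASP idea: score a node by the relative change in average shortest-path length of its local subgraph when the node is removed. The radius-2 ego neighbourhood and the exact formula are assumptions for illustration, not the paper's precise definition.

```python
# Rough sketch (assumed simplification of local RASP): relative change in the
# average shortest-path length of a node's radius-2 ego subgraph after removal.
import networkx as nx

def avg_shortest_path(H):
    """Average shortest-path length over reachable ordered pairs (0 if none)."""
    total, pairs = 0, 0
    for src, lengths in nx.all_pairs_shortest_path_length(H):
        for dst, d in lengths.items():
            if src != dst:
                total += d
                pairs += 1
    return total / pairs if pairs else 0.0

def local_rasp(G, v, radius=2):
    ego = nx.ego_graph(G, v, radius=radius)
    before = avg_shortest_path(ego)
    ego.remove_node(v)
    after = avg_shortest_path(ego)
    return abs(after - before) / before if before else 0.0

if __name__ == "__main__":
    G = nx.karate_club_graph()
    scores = {v: local_rasp(G, v) for v in G.nodes()}
    print(sorted(scores, key=scores.get, reverse=True)[:5])
```

Because only a small ego subgraph is examined per node, this local variant avoids the all-pairs computation over the full network that a global RASP-style measure would require, which is the efficiency argument made above.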
Finding Influential Nodes in Complex Networks Using Nearest Neighborhood Trust Value
Source Title: Studies in Computational Intelligence, Quartile: Q3, DOI Link
View abstract ⏷
Information spreading in complex networks is an emerging topic in many applications such as social leader identification, rumour control, viral marketing, and opinion monitoring. Finding influential nodes plays a pivotal role in information spreading in complex networks, because influential nodes are capable of spreading more information than other nodes. Many centrality measures have been proposed to identify influential nodes in complex networks, such as degree, betweenness, closeness, semi-local centralities, and PageRank. These centrality measures are defined based on the local and/or global information of nodes in the network. Sheng et al. [18] propose a centrality measure based on the information between nodes and the structure of the network. Inspired by this measure, we propose the nearest neighborhood trust PageRank (NTPR) based on the structural information of neighbours and nearest neighbours. The measure is based on the similarity between nodes, the degree ratio, and the trust values of neighbours and nearest neighbours. We evaluate the proposed centrality measure on various real-world networks to find influential nodes, and we compare the results with existing basic centrality measures.
Cooperative Caching Scheme for Machine-to-Machine Information-Centric IoT Networks
Dr Satish Anamalamudi, Mohammed Saeed Alkatheiri., Eesa Al Solami., Abdur Rashid Sangi
Source Title: IEEE Canadian Journal of Electrical and Computer Engineering, Quartile: Q1, DOI Link
View abstract ⏷
Information-centric networks (ICNs), a foreseen future Internet architecture, focus on the content itself rather than its hosting location. With this, the existing Internet Protocol (IP)-based host-centric communication model can be transformed into a content-centric communication model with the support of in-network caching, name-based routing, and location-independent content domain names. Recently, ICN has been proposed as a potential Internet architecture for Internet-of-Things (IoT) networks due to minimal retrieval delays and reduced load on the data producer. Content retrieval from IoT nodes plays a prominent role in enhancing the performance of ICN-IoT networks. State-of-the-art cache retrieval mainly focuses on non-cooperative caching mechanisms, which are not efficient for constrained IoT networks. As an improvement, this article proposes a cooperative caching scheme for information-centric IoT networks that optimizes the cache hit rate with the support of a caching network topology model, a content popularity model, and an exogenous request access model. Simulation results reveal that the proposed cooperative caching scheme improves the cache hit rate and resource utilization and reduces network delay in IoT networks.
Spectrum Handoff Aware AODV Routing Protocol for Cognitive Radio Vehicular Ad Hoc Networks
Dr Satish Anamalamudi, Abdur Rashid Sangi., Mohammed S Alkatheiri., Mohammed A Alqarni., Muhammad Hammad Memon., Wanan Yang
Source Title: Complexity, Quartile: Q1, DOI Link
View abstract ⏷
End-to-end application performance and throughput of the vehicular cognitive transport control protocol depend on how efficiently the segments (TCP headers) are transmitted from source to destination. One way to enhance the performance of the vehicular cognitive TCP protocol is to reduce packet drops between the source and destination. In general, packet drops occur between the source and destination of a Cognitive Radio Vehicular Ad Hoc Network (CR-VANET) because of spectrum handoff, cognitive node handoff, or network congestion. In this paper, we focus on enhancing the performance of the cognitive TCP protocol through a cognitive AODV routing protocol with a spectrum handoff mechanism. In the proposed work, the channel-route control messages of the cognitive AODV routing protocol are updated with the support of spectrum handoff, which helps to provide a backup opportunistic channel while the primary user (PU) is active and reduces end-to-end packet drops caused by spectrum handoff. Simulation results reveal that the overall performance of the vehicular cognitive TCP protocol with the proposed spectrum handoff aware cognitive AODV routing protocol is enhanced compared to the existing cognitive TCP protocol.
Sentiment Analysis on Zomato Reviews
Source Title: 2021 13th International Conference on Computational Intelligence and Communication Networks (CICN), DOI Link
View abstract ⏷
The impact of online reviews on restaurants has reached an unprecedented level, with vast numbers of people checking posted opinions and reviews before ordering food deliveries. The two main concepts used for analyzing online reviews are sentiment analysis and exploratory data analysis (EDA). The goal of sentiment analysis is to determine whether the given data is positive, negative, or neutral; it can help brands determine how their product is perceived by their clientele. Sentiment analysis, otherwise known as opinion mining, uses natural language processing and machine learning algorithms to automatically determine the emotional tone behind online conversations, relying mainly on keywords. The overall analysis is performed on review data from Zomato. Most restaurants available on the application are established ones, so we get a good picture of the restaurants of Hyderabad. Exploratory data analysis (EDA) refers to the initial analysis and findings obtained from datasets, usually early in an analytical process.
An Empirical Study on Impact of News Articles
Source Title: 2021 13th International Conference on Computational Intelligence and Communication Networks (CICN), DOI Link
View abstract ⏷
One of the major factors an author considers when publishing an article is achieving high impact. The impact of an article is broad, and it drives the challenge of generating new ideas and developments. By knowing the impact of an article, an author can increase its visibility and enhance the influence of the published research, improving the quality and standard of the article. Citation count, the number of citations an article has received, is one indicator of this impact. This research examines how to increase the impact of an article so that it receives more citations. Experimental results clearly show how article visibility and citations can be increased, evaluated with different performance metrics.
Application of Steganography Imaging by AES and Random Bit
Source Title: 2021 13th International Conference on Computational Intelligence and Communication Networks (CICN), DOI Link
View abstract ⏷
The goal of steganography is to hide data within another medium, disguising the data so that the existence of the message is concealed. Steganography can be applied to many formats of data, including audio, video, and images, and can hide any kind of digital information through data hiding techniques. In this work, we propose a steganography imaging application that ensures secure transfer of data along with integrity and confidentiality, since steganography relies on hiding messages in unsuspected multimedia data. The application is based on the Advanced Encryption Standard (AES) and a random bit technique.
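As a hedged illustration of the general idea, the sketch below encrypts a message with AES and hides the cipher bits at pseudo-randomly chosen pixel positions (one reading of the "random bit" aspect); the exact scheme in the paper may differ. It assumes numpy and the pycryptodome package.

```python
# Illustrative sketch: AES-EAX encryption followed by embedding the payload
# bits into pseudo-randomly selected pixel least-significant bits (LSBs).
import numpy as np
from Crypto.Cipher import AES
from Crypto.Random import get_random_bytes

def embed(image, message, aes_key, seed):
    """Encrypt the message, then hide its bits in random pixel LSBs."""
    cipher = AES.new(aes_key, AES.MODE_EAX)
    ciphertext, tag = cipher.encrypt_and_digest(message)
    payload = cipher.nonce + tag + ciphertext
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))

    flat = image.flatten()
    rng = np.random.default_rng(seed)                     # shared seed picks the "random" positions
    idx = rng.choice(flat.size, size=bits.size, replace=False)
    flat[idx] = (flat[idx] & 0xFE) | bits                 # overwrite least-significant bits
    return flat.reshape(image.shape), bits.size

if __name__ == "__main__":
    cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # stand-in cover image
    key = get_random_bytes(16)
    stego, n_bits = embed(cover, b"secret report", key, seed=1234)
    print("embedded bits:", n_bits)
```

Extraction would reproduce the same seeded positions, reassemble the payload bytes, and decrypt with the shared AES key; confidentiality comes from AES while the embedding hides the message's existence.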
Decentralized Cloud Storage using Unutilized Storage in PC
Source Title: 2021 12th International Conference on Computing Communication and Networking Technologies, ICCCNT 2021, DOI Link
View abstract ⏷
Cloud storage is growing tremendously and is provided as Infrastructure as a Service (IaaS). We can access data stored in the cloud whenever and wherever we want. Cloud storage providers require substantial resources to maintain the storage servers in their data centres. Nowadays, most of our personal computers (PCs) have a lot of unused storage space and good internet connectivity. Our idea is therefore to use PCs as storage servers by storing data in this unutilized space. We also focus on data security: along with the practices followed by existing cloud storage services, we implement additional methods to protect the data in our decentralized cloud storage model.
Physical Unclonable Function (PUF)-Based Security in Internet of Things (IoT): Key Challenges and Solutions
Dr Satish Anamalamudi, Mohammed Saeed Alkatheiri., Abdur Rashid Sangi
Source Title: Handbook of Computer Networks and Cyber Security, DOI Link
View abstract ⏷
Security protocols play a pivotal role in transmitting sensitive application data over packet-switched and circuit-switched communication. State-of-the-art research has produced constrained IoT designs that provide connectivity between things without any human intervention; hence, IoT has become a promising solution for end-to-end connectivity over constrained network resources. A Physical Unclonable Function (PUF) is a digital logic design incorporated in an Integrated Circuit (IC); it is lightweight, unclonable, and simple to implement. Security mechanisms based on PUFs can be an efficient way to provide security for resource-constrained IoT networks. This chapter describes different security aspects and scenarios of IoT that can use PUF-based mechanisms.
Secure opinion sharing for reputation-based systems in mobile ad hoc networks
Dr Satish Anamalamudi, Abdur Rashid Sangi., Jianwei Liu., Mohammed S Alkatheiri
Source Title: Measurement and Control, Quartile: Q2, DOI Link
View abstract ⏷
Due to their infrastructure-less nature, mobile ad hoc networks are prone to individual or collective misbehavior by participating nodes. Participating nodes may act selfishly, because of limited resources or because they belong to a different administrative domain, and can cause massive loss of network performance. Reputation-based solutions are widely used to mitigate selfishness. These solutions depend to some extent on feedback from participating nodes about any given node, which requires secure exchange in an adverse environment. This paper introduces a secure opinion sharing scheme based on network coding to ensure the effectiveness of any reputation system against selfishness in an adverse environment. The proposed scheme addresses the threat to opinion exchange in any reputation-based solution with only minor changes. In addition, it can be used to exchange secure data in an adverse environment, for example virtual currency and feedback exchange for credit-payment and game-theory-based solutions, respectively. Simulation results show that the scheme achieves an excellent opinion exchange ratio, moderate delay, and affordable per-cycle overhead.
Cognitive AODV routing protocol with novel channel-route failure detection
Source Title: Multimedia Tools and Applications, Quartile: Q1, DOI Link
View abstract ⏷
The performance of a network-layer routing protocol in Cognitive Radio Ad Hoc Networks (CRAHNs) depends mainly on the probability of channel availability for application data transmission. To attain this, end-to-end channel-route control messages should be disseminated efficiently with minimal channel-route re-connection delays. In CRAHNs, end-to-end channel-route failures can be caused by spectrum handoff (dynamic primary user intervention), selfish node activity, CR node handoff, or bandwidth degradation. To enhance application throughput, it is pivotal to determine the exact cause of a channel-route failure and provide an alternate end-to-end channel-route path. To achieve this, this paper proposes a channel-route-failure-aware Cognitive-AODV routing protocol with modifications to the channel-route-error (channel-RERR) message that detect the exact channel-route failure and provide the best alternate end-to-end channel-route path between source and destination. Experimental results reveal that the proposed Cognitive AODV routing protocol outperforms existing Cognitive AODV routing protocols under selfish node activity, spectrum handoff, and node handover.