A Proposed Ensemble Based Model to Classify the Personality Using Big-Five Model
Authors: Mrs. Alpana Meena, Dr. Neelesh Jain
Abstract: Text can be categorized by machine learning algorithms according to its sentiment polarity. Such approaches treat sentiment classification as a problem akin to document or topic classification. Nevertheless, they have drawbacks, including difficult feature extraction, dimensionality expansion, and sparse feature vectors. In this article, we combine machine learning and deep learning algorithms and propose an ensemble-based framework to classify personality. We conducted experiments with machine learning models, deep learning models, and the proposed model, and found that the proposed model achieved a significant average accuracy of about 67.452.
DOI: https://doi.org/10.61137/ijsret.vol.11.issue4.102
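The abstract does not specify the base learners or the combination rule, but a common ensemble strategy for this kind of framework is soft voting over class probabilities. A minimal sketch, assuming three hypothetical base models that each emit a probability distribution over the five Big-Five trait labels (all probabilities below are made up for illustration):

```python
# Hedged sketch of soft-voting ensembling over Big-Five trait labels.
TRAITS = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]

def soft_vote(prob_lists):
    """Average the class-probability vectors from several models and
    return the trait with the highest mean probability."""
    n_models = len(prob_lists)
    n_classes = len(prob_lists[0])
    mean = [sum(p[i] for p in prob_lists) / n_models
            for i in range(n_classes)]
    return TRAITS[mean.index(max(mean))], mean

# Illustrative (invented) outputs from three base models:
svm_p  = [0.10, 0.50, 0.20, 0.10, 0.10]
nb_p   = [0.20, 0.40, 0.20, 0.10, 0.10]
lstm_p = [0.15, 0.45, 0.25, 0.10, 0.05]
label, mean = soft_vote([svm_p, nb_p, lstm_p])
```

Averaging probabilities (soft voting) tends to be more stable than majority voting when the base models are well calibrated.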
Early Detection Of Stroke Risk Factors Using Machine Learning Models
Authors: Vinodh kumar S, Pravin P, Siva Sankaralingam G
Abstract: This work predicts the occurrence of strokes using models trained on demographic, lifestyle, and clinical parameters from a well-known public healthcare dataset containing information about age, gender, hypertension, heart disease, smoking, BMI, and average glucose level. Preprocessing of the data involved dealing with missing values, encoding the categorical variables, and balancing the data through the Synthetic Minority Oversampling Technique (SMOTE) to overcome the problem of class imbalance. Several algorithms were then trained and tested, including Logistic Regression, Decision Trees, Random Forest, SVM, and Naive Bayes, and performance was evaluated in terms of accuracy, precision, recall, and F1 score. The results show that ensemble and tree-based algorithms obtain high precision, with more than 90% accuracy in predicting who could be at major risk for stroke. Feature-importance analysis shows age, hypertension, heart disease, and glucose level to be important predictors. The results show that machine-learning methods hold potential for early risk assessment of stroke and strongly support the implementation of data-driven tools in clinical decision-making to provide timely intervention and reduce stroke morbidity and mortality.
DOI: https://doi.org/10.5281/zenodo.15833732
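SMOTE, as used above, synthesizes new minority-class samples rather than duplicating existing ones. A stdlib-only sketch of the core idea, linear interpolation between minority samples; the real SMOTE interpolates toward k-nearest neighbours, and the feature values below are invented, not taken from the paper's dataset:

```python
import random

def smote_like(minority, n_new, seed=0):
    """Generate synthetic minority samples by linear interpolation
    between randomly chosen pairs (a simplified SMOTE sketch)."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        a, b = rng.sample(minority, 2)   # pick two distinct samples
        lam = rng.random()               # interpolation weight in [0, 1)
        synthetic.append([ai + lam * (bi - ai) for ai, bi in zip(a, b)])
    return synthetic

# Toy minority class, features: [age, avg_glucose, bmi]
minority = [[67.0, 228.7, 36.6], [80.0, 105.9, 32.5], [49.0, 171.2, 34.4]]
new_pts = smote_like(minority, 4)
```

Each synthetic point lies on a segment between two real minority points, so it stays inside the minority region rather than duplicating any single record.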
The Impact of Video Games on Mental Health: Future Benefits and Risks
Authors: Dnyneshwar Gawade
Abstract: Video games have evolved from simple entertainment to complex digital environments with significant implications for mental health. This paper explores the dual nature of video games, highlighting their potential benefits, such as cognitive enhancement, stress relief, and social connectivity, alongside risks like addiction, social isolation, and increased aggression. Drawing on recent studies, we examine how game genres, play duration, and player demographics influence mental health outcomes. We also discuss future directions, including the development of therapeutic games and the need for nuanced research to inform policy and game design. The findings suggest that moderate gaming can foster mental well-being, while excessive use poses significant risks, necessitating tailored interventions.
DOI: https://doi.org/10.5281/zenodo.15836400
Secure Edge Data Deduplication Under Uncertainties Using AES-Based Robust Optimization
Authors: Dr. M.M. Janeela Therasa, Gayathri CL
Abstract: Mobile Edge Computing (MEC) enables low-latency data processing by bringing computation closer to users. However, it faces critical challenges such as limited storage capacity, unpredictable data patterns, network instability, and increasing volumes of duplicate data. These factors lead to performance degradation, increased retrieval latency, and inefficient resource utilization. To tackle these issues, a robust optimization-based deduplication framework is proposed. This system leverages two algorithms: UEDDE-C (Uncertainty-based Edge Data Deduplication with Column and Constraint Generation), which provides high accuracy in detecting exact duplicates, and UEDDE-A (Approximation-based), which is computationally lightweight and effective for identifying near duplicates. Furthermore, since data security is a pressing concern in edge environments, the proposed system integrates AES encryption to safeguard sensitive information before the deduplication process. This ensures not only confidentiality and integrity but also standardizes data into consistent formats for more efficient handling. The proposed framework significantly reduces redundant storage, lowers network traffic, speeds up data access, ensures security compliance, and enhances overall MEC system reliability under uncertain and dynamic conditions.
DOI: https://doi.org/10.5281/zenodo.15834266
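The UEDDE algorithms themselves are not detailed in the abstract, but exact-duplicate detection of the kind UEDDE-C performs is typically built on content fingerprints: identical blocks hash to identical fingerprints, so only one copy is stored. A hedged sketch of the fingerprint-and-index step only, using SHA-256 in place of the AES stage (AES needs a third-party crypto library; with a deterministic encryption mode, identical plaintexts would still map to identical ciphertexts, so the same logic applies after encryption):

```python
import hashlib

def dedupe(blocks):
    """Keep one stored copy per unique block.
    Returns (unique_blocks, index_map), where index_map[i] is the
    position of block i's stored copy in unique_blocks."""
    seen = {}                  # fingerprint -> position in `unique`
    unique, index_map = [], []
    for blk in blocks:
        fp = hashlib.sha256(blk).hexdigest()
        if fp not in seen:
            seen[fp] = len(unique)
            unique.append(blk)
        index_map.append(seen[fp])
    return unique, index_map

# Hypothetical edge data blocks with duplicates:
blocks = [b"sensor-A", b"sensor-B", b"sensor-A", b"sensor-C", b"sensor-B"]
unique, idx = dedupe(blocks)
```

Here five incoming blocks reduce to three stored copies, with the index map preserving enough information to reconstruct the original stream.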
Experimental Analysis on Hybrid Technique for Traffic Flow Prediction with Missing Traffic Data
Authors: B.Dileep, Assistant Professor Mudigonda Harish Kumar
Abstract: This paper proposes hybrid traffic flow prediction techniques using Parametrical Doped Learning (PDL) and Truncated Dual Flow Optimization (TDFO) along with Adaptive Wildfire Optimization (AWO) and Spatial Pattern Super Learning (SPSL). These techniques are validated using datasets from LTPP and PeMS. Performance comparison with traditional algorithms like TrAdaBoost, KLT, and others shows superior outcomes in terms of accuracy, F1-score, sensitivity, and recall.
Analysis of Transport Network and Regional Development: The State of the Art
Authors: M. Ramesh Reddy, Assistant Professor K.Abhiram
Abstract: Here we offer a study that falls under the umbrella of regional studies. While ideas like regional development and resources—with a focus on transportation infrastructure—have been around for a while, regional science is a more recent subfield of economics. Nearly everyone is more worried about regional economic planning now than they were a decade ago. A plethora of new concerns and challenges have recently come to light, stemming from the experiences of both industrialized and poor nations. The desirability of various sites for different activities has altered due to the integration of economic activity. Both regional planning on an individual level and coordinated regional planning at the system level have recently garnered increased attention from the governments of a number of nations. A big problem for regional economists has been the increased need for comprehensive information brought about by this interest in economic planning. A nation or area needed a specific amount of transportation infrastructure to make the most of its resources at any particular point in its economic growth (Adami, 1987). The effectiveness of the transportation network is a key component of the regional economic structure. In order to reach both local and foreign markets, a country’s production and distribution system relies on its transportation network, particularly its road network. This network must be both adequate and efficient. In addition to the particular function it provides, the transport network is important because of the integrating and uniting effect it has on society and the economy.
Analysis and Investigation towards Performance Enhancement of Traffic Light Control System
Authors: K.Ganesh, Assistant Professor K.Abhiram
Abstract: With these new tools, we can only speculate as to the level of intelligence that the next generation will achieve by modifying current methods. Society increasingly benefits from more sophisticated computing techniques. This article explores and discusses the smart road traffic system, which is one such advantage. Using these state-of-the-art computational approaches will undoubtedly make the control system for road traffic lights easier and more efficient in the future. In order to optimize the overall trip time, the road traffic system takes into account several parameters, such as total travel time, wait time or delay time in the traffic line, vehicle speed, etc. There are a few issues with road traffic flow, such as drivers not obeying the rules of the road, accidents, and congestion. The real issue with traffic congestion is the lack of a better model for managing traffic on roads, which requires dedicated research effort to solve. The current traffic light control system has a few limitations, such as not being able to adjust the delay time based on the density of traffic and not being able to prioritize lane flow when emergency vehicles like fire trucks and ambulances are on the road. This study aims to find solutions to the aforementioned traffic issues and, using the aforementioned factors, to optimize the overall travel time in a way that causes minimal delay or congestion. To address these issues, this study employs a probabilistic traffic queue analytical model to find a computationally efficient traffic model. An improved traffic light management system is developed by simulating a model that takes vehicle categorization into account and testing it under various traffic conditions. The model accounts for a variable time delay.
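The abstract's probabilistic traffic queue model is not specified, but the simplest analytical queue linking arrival density to delay is the M/M/1 model, in which the mean waiting time grows sharply as an approach nears saturation. A hedged illustration of that relationship; the arrival and service rates below are invented, not taken from the paper:

```python
def mm1_delay(arrival_rate, service_rate):
    """Return (utilisation, mean wait in queue Wq, mean time in system W)
    for an M/M/1 queue; requires arrival_rate < service_rate."""
    lam, mu = arrival_rate, service_rate
    assert lam < mu, "queue is unstable when lambda >= mu"
    rho = lam / mu                 # utilisation of the signal approach
    wq = rho / (mu - lam)          # mean wait before service (Wq)
    w = 1.0 / (mu - lam)           # mean total time in system (W)
    return rho, wq, w

# e.g. 0.4 vehicles/s arriving, the signal serving 0.5 vehicles/s on average
rho, wq, w = mm1_delay(0.4, 0.5)
```

At 80% utilisation the mean wait is already 8 seconds per vehicle; pushing the arrival rate toward the service rate makes delay diverge, which is why density-adaptive green times matter.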
CFD Simulation of an Industrial Dust Cyclone Separator: Influence of Inlet Velocity and Particle Size on Performance
Authors: Dyaga Suresh
Abstract: Cyclone separators are widely used in industrial applications to remove particulate matter from gaseous streams. The performance of a cyclone is highly dependent on operational parameters such as inlet velocity and particle size. In this study, Computational Fluid Dynamics (CFD) is utilized to simulate and evaluate the performance of an industrial cyclone separator. Various inlet velocities (10 m/s to 25 m/s) and particle diameters (1 µm to 50 µm) were analyzed to determine their effects on separation efficiency and pressure drop. The simulations reveal that both inlet velocity and particle size significantly influence cyclone performance. This work helps optimize cyclone design parameters to improve dust removal efficiency while minimizing energy losses.
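A quick analytical cross-check for such CFD results is Lapple's classical cut-diameter model, in which the particle size captured with 50% efficiency shrinks as inlet velocity rises, consistent with the reported velocity sensitivity. A sketch under assumed geometry and gas/particle properties (all values below are illustrative, not taken from the paper):

```python
import math

def lapple_d50(mu, inlet_width, n_turns, v_inlet, rho_p, rho_g):
    """Lapple cut diameter (m): the particle size collected with 50%
    efficiency. A smaller d50 means finer particles are captured."""
    return math.sqrt(9 * mu * inlet_width /
                     (2 * math.pi * n_turns * v_inlet * (rho_p - rho_g)))

# Assumed values for illustration only:
mu, W, Ne = 1.8e-5, 0.2, 5        # air viscosity (Pa*s), inlet width (m), turns
rho_p, rho_g = 2000.0, 1.2        # particle and gas density (kg/m^3)
d50_slow = lapple_d50(mu, W, Ne, 10.0, rho_p, rho_g)   # 10 m/s inlet
d50_fast = lapple_d50(mu, W, Ne, 25.0, rho_p, rho_g)   # 25 m/s inlet
```

The micron-scale cut diameter drops with increasing inlet velocity, mirroring the efficiency trend the CFD study reports, while pressure drop (not modeled here) rises roughly with the square of velocity.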
Analyzing Music Streaming Patterns Through Business Intelligence: A Power BI-Based Study Of Spotify Data
Authors: Dr. Suhas Mache, Bhagyashree Ingale
Abstract: Globally, music streaming services have completely changed how people listen to music. This study uses Power BI-based data analysis to investigate Spotify user behavior and listening habits. Using dynamic dashboards and visual aids, the study looks at regional preferences, artist performance, track patterns, and genre popularity. With the goal of assisting artists, marketers, and producers in making defensible decisions based on streaming data, business intelligence approaches provide actionable insights.
DOI: https://doi.org/10.5281/zenodo.15836070
Analysis on Knowledge Based Prediction System for Road Traffic Congestion
Authors: S.Likhitha Priya, Assistant Professor B.Narsimha
Abstract: The transportation networks, economy, and quality of life in cities throughout the world are all negatively impacted by the pervasive problem of traffic congestion. Congestion relief is of critical importance in India since the country’s fast urbanization and population increase are making the situation worse. By integrating user and expert perspectives and using a data-driven approach to prepare a knowledge base for traffic congestion, this thesis presents a holistic strategy to address road traffic congestion. The system is tailored to the Indian scenario in the context of inter-urban highways. One module of this thesis is based on user survey methodology, while the other is centered around applying deep learning techniques to analyze the dynamics of congestion. Several facets of congestion prediction are covered in this thesis’s objectives. Through observation and feedback, it seeks to understand the causes that contribute to congestion and identify its characteristics from the perspective of travelers. To achieve this goal, we analyzed data from thirteen separate sites along the undivided interurban highway that were selected based on their mixed land use pattern and where congestion is an issue. The survey questionnaire yielded 206 responses, which constitute the data set. The Mann-Whitney U test was used to see if there is any heterogeneity between the two traveler groups, drivers and passengers. This goal assesses the features of congestion using the Relative Importance Index (RII) and also scores travelers’ preferences for measures to alleviate it, in addition to identifying the attributes of congestion. Additionally, it suggests legislative recommendations based on the opinions of travelers in an effort to enhance their traveling experience on interurban highways.
The second aim of the thesis uses exploratory factor analysis on the characteristics derived from the first objective, together with empirical knowledge, to investigate the interrelationships among the congestion factors. This objective sampled data from 282 replies and 5 experts. These professionals come from a variety of backgrounds in the transportation industry. After conducting an exploratory factor analysis on the traffic congestion attributes identified in objective one, using interpretive knowledge based on the experiments, the fuzzy-based total interpretive structural modeling technique (Fuzzy TISM) was used to model the interrelationships between the factors. This technique provides both direct and indirect links of cause and effect between the factors. MICMAC analysis has been used to determine the independent, dependent, linkage, and autonomous components in the connection diagram derived via Fuzzy TISM. Eight important factors contributing to traffic jams on interurban highways were identified through exploratory factor analysis. These factors include inadequate road geometry (C1), environmental factors (C2), external events (C3), archaic traffic management (C4), inefficient public transportation (C5), driving behavior (C6), special events (C7), and regional economic dynamics (C8). Using the Fuzzy TISM technique, the variables are partitioned into different levels of the hierarchical connection diagram. At Level I we have C3, C4, and C6; the Level II component is C5; the third level comprises C1 and C7; and the fourth level comprises C2 and C8. Based on the results of the fuzzy MICMAC analysis, we can deduce that factors such as Regional Economic Dynamics (C8), Special Events (C7), and Environmental Factors (C2) are independent variables that impact most of the factors, despite having less driving power. The occurrence of these variables also affects the linkage factor.
In this study, archaic traffic management (C4) is a linkage factor that has both high driving power and high dependence power owing to its unstable nature. It impacts and is affected by other factors such as road geometry dynamics (C3), logistical events (C4), inefficient public transportation facilities (C5), and driving behavior (C6). The first two objectives complete the first module of this thesis; objectives three and four form its second module. The third objective of the project is to automate the process of traffic volume determination utilizing modern deep learning detectors and trackers. This would enable the collection of real-time data for congestion prediction. This objective has been tested on a bespoke dataset consisting of 10,000 virtual reality images from India. The photos in this dataset depict a variety of highway situations with several automobiles. A model for vehicle categorization and traffic volume counting has been developed. It has been trained using YOLO variants, a single-stage object identification methodology, and Byte tracker, a counting method. The results of the traffic volume counter are examined on intersection scenes and mid-blocks of the inter-urban highways, on videos ranging from one to five minutes. The short videos achieve an accuracy of 98%, while the long videos reach an accuracy of up to 87%. In order to build a comprehensive knowledge base for predicting congestion states, the fourth objective of the study proposes a framework for a knowledge-base prediction system for road congestion. This framework integrates the results of objectives one and three by combining the impedance factors identified in objective one with traffic flow variables obtained from objective three. The 84-hour data sample includes both weekdays and weekends, and this objective has been executed on that basis.
The ANFIS (Adaptive Network-based Fuzzy Inference System) technique, which is based on several membership functions, has been used to analyze and generate the if-then rules for decision makers. It demonstrates that the existence of multiple impedance factors is responsible for the differences in congestion levels on weekdays and weekends. Finally, to improve the efficacy and precision of congestion prediction, the proposed knowledge-based road traffic system makes use of both user-driven insights and expert knowledge. By combining user feedback with expert analysis and automated data collection techniques, the system provides a comprehensive view of traffic congestion and how to predict it. In the Indian context, where varied traffic situations and intricate urban landscapes call for precise prediction systems, this comprehensive methodology is of utmost importance. The system is able to produce strong and trustworthy predictions because it uses a combination of fuzzy logic and deep learning techniques, which allow it to deal with the inherent uncertainty and complexity of real-world traffic scenarios. Additionally, the system may enhance its predictive capabilities through the incorporation of user feedback and expert insights, guaranteeing its relevance and efficacy in ever-changing traffic scenarios. Finally, as a solution to the widespread issue of traffic congestion in India, the proposed knowledge-based road traffic congestion prediction system brings hope. The system offers a comprehensive framework for understanding, analysing, and forecasting congestion levels by combining user perspectives, expert knowledge, and sophisticated technology. Improving traffic management strategies, reducing congestion, and making urban transportation systems in India more efficient might all be possible with the use of this system.
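ANFIS learns its membership functions from data, but the if-then rule mechanics it builds on rest on simple membership evaluation and min-style antecedent combination. A toy sketch of one such rule (not the thesis's rule base; the variable ranges below are invented for illustration):

```python
def tri(x, a, b, c):
    """Triangular membership: 0 outside [a, c], peaking at 1 when x == b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def congestion_degree(flow_veh_h, impedance):
    """One fuzzy rule: IF flow is high AND impedance is high THEN congested.
    Firing strength = min of the antecedent memberships."""
    high_flow = tri(flow_veh_h, 800, 1500, 2200)   # assumed range, veh/h
    high_imp = tri(impedance, 0.4, 0.7, 1.0)       # assumed 0..1 scale
    return min(high_flow, high_imp)

deg = congestion_degree(1500, 0.7)   # both antecedents at their peaks
```

A full fuzzy inference system aggregates many such rules and defuzzifies the result; ANFIS additionally tunes the membership parameters (here a, b, c) by gradient-based training.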
Analysis on Modelling of Traffic Pollution to Biomonitor Traffic Police
Authors: MD.Afroz, Assistant Professor B.Narsimha
Abstract: In Indian cities, air pollution is a major problem that affects both the environment and people’s health. The traffic, congestion, and pollution levels of Mumbai, one of the most densely populated metropolises in the world, have been steadily rising in recent years. When it comes to city traffic, the top 20 locations are well-known for their extreme congestion. These areas are known as “hotspots” because of the large concentration of vehicles at certain intersections around the city. At the morning’s peak hour (from 7:30 to 9:00 am) and the evening’s peak hour (from 6:30 to 7:30 pm), cars travel at an average speed of 10 kmph and 8 kmph, respectively. Records show that evenings are the most polluting and crowded times of day. Congestion in the roadways exposes the general population to air and noise pollution. According to recorded data, the noise limits specified by the Central Pollution Control Board are exceeded at the intersections. The noise levels at the intersections are over 90 dB on average, even though they should be below 75 dB for industrial settings, 65 dB for commercial settings, 55 dB for residential areas, and 50 dB for quiet zones. Vehicle emissions include a wide range of contaminants, including suspended particulate matter (SPM), oxides of sulfur (SOx), PAN, and nitrogen oxides (NOx). SPM can cause illnesses affecting the respiratory system. The research here takes RSPM 2.5 (respirable suspended particulate matter) into account. At 20 different hotspots, we assessed noise levels, RSPM 2.5, number of cars, and congestion at different times of the day and season. The aforementioned parameters are averaged and their standard deviations are computed. In addition, CalRoads View Software is used for data analysis and modeling. Predicting pollution concentrations near intersections where cars idle while waiting for signals is the job of CALINE 3. This program was useful for figuring out how pollutants were distributed at the intersections.
Congestion areas and intersections were the sites of the surveys. The survey was conducted via a questionnaire and was open to retailers, street vendors, and traffic police. Breathing and respiratory issues affected the majority of those who participated. Many of them have had asthma diagnoses at some point. The exposed individuals showed signs of both acute and chronic illnesses. Many respondents reported having an illness of the respiratory system, such as sinusitis, tonsillitis, bronchitis, or pneumonia. Researchers believe that traffic officers are the best demographic to investigate because of their heightened sensitivity to the effects of vehicle emissions on human systems, particularly the respiratory system and hearing. Participants in the study will be traffic officials tasked with administrative duties, who will serve as the control group, while traffic officials stationed at different intersections will serve as the exposed group. Individuals working in traffic were tested for noise-induced hearing loss (NIHL). The results clearly show that traffic police have a greater risk of hearing loss. Notably, 57.14% of traffic police who have been on the job for more than 25 years have suffered serious impairment from NIHL. The current investigation did not find any cases of genetic or familial deafness or ear disease among the participants. The prevalence of asthma was also assessed in other traffic staff. Asthma prevalence among traffic workers can be diagnosed using spirometry analysis; the MINISPIR S/N TO 1795 is used to find the maximum expiratory flow rate in liters per minute. Asthma incidence was diagnosed among traffic police officers aged 18–58 years using forced vital capacity (FVC), with the diagnosed officers’ exposure ranging from under five years of service upwards. The PEFR is measured in both the exposed and control groups of traffic police.
Approximately eight hours of each workday expose traffic workers directly to particulate matter. Diagnosed traffic police range from less than five years to over twenty-five years of experience. Traffic police who were actively pursuing a smoking habit were excluded from the analysis. The exposed and control groups were both tested for diseases. More than 35% more people in the exposed group have asthma than in the control group. According to the data, there is also an increase in the number of cases of asthma. The effects of air pollution on traffic workers can be mitigated by the development and testing of protective gear. Surgical masks and handkerchiefs are examples of compensatory measures people take in response to pollution threats in order to lower their actual exposures. The research set out to develop a study mask capable of preventing exposure to potentially harmful particles such as dust, allergens, and infectious aerosols. Gradually, further features were included to minimize noise and dust particles that might potentially harm the eyes. We created and tested stage-wise masks. A nose piece with inhaler, an exhaust fan near the mouth or nose section, light weight, ease of wearing, adaptability to all face sizes and shapes, low maintenance, and aesthetic appeal are all features and components that were combined. To determine how effective the study mask was, it was compared to other masks on the market and put through its paces. How well current face masks protect, however, remains uncertain. The purpose was to find out how well face masks protect the respiratory system against RSPM levels that might be hazardous. After subtracting the SPM particles from the total air volume inhaled in one minute, the readings on the muslin fabric are reported. Additional issues caused by RSPM are mitigated to a certain degree. The average mass of particles collected in one minute is greater than 0.1 g/min.
Not only does the mask shield you from harmful RSPM levels and noise, but it also has an exhaust fan that draws carbon dioxide out of your mouth and keeps you well-ventilated. With this mask, you won’t have to worry about the usual issues of sweating or itching while wearing it. Stages 1 and 2 of the mask’s modification process were carried out utilizing 3D printer technology. The mask underwent both quantitative and qualitative testing in accordance with OSHA regulations, and test methods were prepared for submission to the ethics committee. All things considered, the outcomes are satisfactory. Additional improvements in terms of size variations, compatibility with additional safety gear like helmets, etc., will allow the device to be released to the public for usage.
Performance Analysis of Two Potential Industrial Waste Materials: FA & L-Sludge
Authors: B.Lavanya, Assistant Professor M.Harish Kumar
Abstract: This study stabilized two potentially hazardous industrial waste materials—Fly Ash (FA) and Lime Sludge (LS)—using Commercial Lime (CL) and Gypsum (G), respectively. The FA and LS come in large quantities from thermal power plants and water treatment plants, and they pose environmental hazards. The goal was to make the sludge and FA suitable for use in Civil Engineering construction applications. Tests for Unconfined Compressive Strength (UCS), California Bearing Ratio (CBR), and Split Tensile Strength (STS) were conducted on 39 different combinations of FA, LS, CL, and G. Using these metrics, we determined that two composites stabilized with 12% CL and 1% G performed admirably: optimum mix 1 (95% FA + 5% LS) and optimum mix 2 (50% FA + 50% LS). After 28 days of curing, the UCS was 6.6 MPa for optimum mix 1 and 5.8 MPa for optimum mix 2, while the STS was 1.3 MPa and 1.1 MPa, respectively. After 28 days of curing, optimum mix 1 had a soaked CBR value of 75% and an unsoaked value of 89%. When optimum mix 2 was cured for 28 days, the CBR values were 91% for the unsoaked condition and 82% for the soaked condition. When tested for durability according to both British and American standards, both composites passed with flying colors. For instance, following twelve cycles of wetting and drying, mix 2 samples lost 1.05% of their initial weight while mix 1 samples lost 1.12%, both well within the recommended criteria. Also, after 28 days of curing, the composites showed almost no leaching of heavy metal content. Additionally, scanning electron microscopy (SEM) and X-ray diffraction (XRD) were used to study the development of cementitious compounds during curing. Having said that, the stabilized composites exhibited brittleness. The ductility and strength were examined after adding fibers to these composites in an effort to reduce their brittleness.
The results of the trials with various fiber percentages showed that adding 0.3% of fiber to both composites increased the mix’s strength and ductility. The strength index and deformability index were used to measure the improvement in strength and ductility, respectively. There is an 80% increase in failure strain and a 40% increase in strength when fibers are added. The findings indicated that the composites achieved the necessary strength, durability, and ductility to be used as base course layer materials in flexible pavements. Furthermore, it is recommended that the total pavement thickness be decided upon based on reliability, taking into account the uncertainty in input data such as the design traffic load (measured in Million Standard Axles, MSA) and soil bearing capacity (measured as a CBR value). In order to determine the total thickness of the flexible pavement that can guarantee a specific degree of reliability in the pavement’s performance, design charts were produced, taking into account the uncertainty in the input data. In addition, the reliability-based technique—which combines the Mechanistic-Empirical approach with Monte Carlo Simulation (MCS) and the First Order Reliability Method—was used to study the uncertainties in distress analysis of flexible pavement, specifically with regard to fatigue and rutting failure. In order to determine the pavement’s performance, the reliability index (β) was calculated, and the strain at crucial sites for fatigue and rutting failure was calculated using PLAXIS 2D software on a three-layered flexible pavement model. The specified strength, durability, and ductility requirements are met by the produced composites, namely Optimum Mix 1 [(95%FA+5%LS), 12% CL, 1% G and 0.3% F] and Optimum Mix 2 [(50%FA+50%LS), 12% CL, 1% G and 0.3% F].
The compressive strain value at the crucial point was found to be ɛc = 13.42 × 10^-5 and the tensile strain value at the same region was found to be ɛt = 18.11 × 10^-5 for a 450 mm thick foundation of optimum mix 2. For a design traffic load of 10 MSA, the pavement performed admirably (βR and βF > 5) according to the probabilistic methodology. However, when subjected to larger traffic loads, the pavement exhibited above-average performance in terms of fatigue and rutting.
Design of Flexible Pavement with Maximum Utilization of Industrial Waste
Authors: G.Srikanth, Assistant Professor M.Harish Kumar
Abstract: Since the subgrade is the bedrock upon which the pavement model is built, its behaviour is crucial to the design of flexible road projects. Therefore, pavement performance and design analysis need a considerable deal of attention. Expansive soils include a high concentration of clay and silt, hence it is necessary to stabilise or compact the soil subgrade prior to building a flexible pavement. Subgrade soil replacement is the method of choice for stabilising certain types of soils. Fly ash, an industrial waste product, is currently utilized at only around 55% of its total available quantity. Substituting fly ash in the subgrade at varying percentages and layering bitumen-coated chicken mesh appropriately will make good use of the fly ash. A better California Bearing Ratio (CBR) can be achieved by using bitumen-coated chicken mesh as extra reinforcement. This study utilises the CBR mould to compact fly ash samples until they reach their maximum dry density and optimum moisture content. The samples are then tested with or without bitumen-coated bamboo meshes and chicken mesh layers. The plan dimensions of the chicken mesh match those of the CBR mould, and the mesh was placed in preparations of 1, 2, 3, and 4 layers. The laboratory tests show that the mixture with four layers of bitumen-coated chicken mesh and a 15% substitution of fly ash has the highest CBR value, indicating that it is very strong. The soil’s moisture content is lowered by using plastic waste as a partial replacement. In this study, various percentages of plastic waste are introduced into the soil in order to quantify and compare the values of UCS, CBR, MDD, and OMC with the unmodified soil.
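For reference, the CBR values compared throughout this abstract are ratios of measured penetration pressure to a standard crushed-stone pressure: 6.9 MPa at 2.5 mm penetration and 10.3 MPa at 5.0 mm, per common practice (e.g. IS 2720 / ASTM D1883). A small sketch with hypothetical test pressures, not values from the study:

```python
def cbr_value(p_2_5_mpa, p_5_0_mpa):
    """CBR (%) = test pressure / standard pressure * 100, using the
    standard pressures 6.9 MPa (2.5 mm) and 10.3 MPa (5.0 mm).
    Conventionally the 2.5 mm value governs unless the 5.0 mm value
    is higher on a repeat test; here we simply report the larger."""
    cbr_2_5 = p_2_5_mpa / 6.9 * 100.0
    cbr_5_0 = p_5_0_mpa / 10.3 * 100.0
    return max(cbr_2_5, cbr_5_0)

# Hypothetical penetration-test pressures for a reinforced fly-ash sample:
cbr = cbr_value(4.14, 5.15)
```

Higher mesh reinforcement raises the load the sample carries at a given penetration, which is exactly what pushes the computed CBR percentage up.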
Enhancing Student Performance Prediction Using Random Forest and Feature Engineering Algorithms in Machine Learning
Authors: Ms. T. Nandhini (Supervisor, Assistant Professor), M. Mugesh Kumar, M. Snekan, R. Tamilselvan
Abstract: This project presents a machine learning-based methodology for student performance prediction with Random Forest and Feature Engineering. Academic institutions are becoming more dependent on data-driven intelligence to enhance educational planning and student support. Conventional models usually do not capture various student characteristics like demographic, academic, and behavioral features, thus restricting predictive capabilities. In this paper, we overcome these limitations by suggesting an ensemble learning approach with sophisticated feature engineering to enhance the interpretability and flexibility of the prediction process. The Random Forest classifier is employed due to its high accuracy and stability, and the model is assessed using metrics like accuracy, precision, recall, and F1-score. Experimental results indicate that the new system surpasses conventional AFSA-based models in detecting at-risk students, allowing for early intervention approaches to improve academic performance.
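The evaluation metrics named above derive directly from confusion-matrix counts. A stdlib sketch with hypothetical counts for the at-risk class (the numbers are invented for illustration):

```python
def prf1(tp, fp, fn):
    """Precision, recall, and F1 from confusion-matrix counts
    (true positives, false positives, false negatives)."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical counts for the "at-risk" class:
p, r, f1 = prf1(tp=80, fp=20, fn=10)
```

For at-risk-student detection, recall is usually the metric to watch: a false negative is a student who needed intervention but was missed.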
Performance Analysis of Two Potential Industrial Waste Materials: FA & Lime Sludge
Authors: B. Lavanya, Assistant Professor M. Harish Kumar
Abstract: This study stabilized two potentially hazardous industrial waste materials, Fly Ash (FA) and Lime Sludge (LS), using Commercial Lime (CL) and Gypsum (G), respectively. FA and LS are produced in large quantities by thermal power plants and water treatment plants, and they pose environmental hazards. The goal was to make the sludge and FA suitable for Civil Engineering construction applications. Unconfined Compressive Strength (UCS), California Bearing Ratio (CBR), and Split Tensile Strength (STS) tests were conducted on 39 different combinations of FA, LS, CL, and G. From these metrics, two composites stabilized with 12% CL and 1% G performed admirably: optimum mix 1 (95% FA + 5% LS) and optimum mix 2 (50% FA + 50% LS). After 28 days of curing, the UCS was 6.6 MPa for optimum mix 1 and 5.8 MPa for optimum mix 2, with corresponding STS values of 1.3 MPa and 1.1 MPa. After 28 days of curing, optimum mix 1 had a soaked CBR value of 75% and an unsoaked value of 89%; optimum mix 2 had CBR values of 91% unsoaked and 82% soaked. Both composites passed durability tests conducted to British and American standards: after twelve cycles of wetting and drying, mix 2 samples lost 1.05% of their initial weight and mix 1 samples lost 1.12%, both well within the recommended criteria. In addition, after 28 days of curing the composites showed negligible leaching of heavy metals. Scanning electron microscopy (SEM) and X-ray diffraction (XRD) were used to study the development of cementitious compounds during curing. However, the stabilized composites exhibited brittleness, so fibers were added to these composites and the resulting ductility and strength were examined.
The results of trials with various fiber percentages showed that adding 0.3% fiber to both composites increased the mix's strength and ductility; the strength index and deformability index were used to measure the improvements in strength and ductility, respectively. Adding fibers produced an 80% increase in failure strain and a 40% increase in strength. The findings indicated that the composites achieved the necessary strength, durability, and ductility for use as base course layer materials in flexible pavements. Furthermore, it is recommended that the total pavement thickness be decided on a reliability basis, taking into account the uncertainty in input data such as the design traffic load (measured in Million Standard Axles, MSA) and soil bearing capacity (measured as the CBR value). Design charts were produced to determine the total thickness of flexible pavement that can guarantee a specified degree of reliability in the pavement's performance, accounting for this uncertainty. In addition, a reliability-based technique, combining the Mechanistic-Empirical approach with Monte Carlo Simulation (MCS) and the First Order Reliability Method, was used to study the uncertainties in distress analysis of flexible pavement, specifically fatigue and rutting failure. To determine the pavement's performance, the reliability index (β) was calculated, and strains at critical locations for fatigue and rutting failure were computed using PLAXIS 2D on a three-layered flexible pavement model. The specified strength, durability, and ductility requirements are met by the produced composites, namely Optimal Mix 1 [(95%FA+5%LS), 12% CL, 1% G and 0.3% F] and Optimal Mix 2 [(50%FA+50%LS), 12% CL, 1% G and 0.3% F].
The compressive strain at the critical location was found to be εc = 13.42 × 10^-5 and the tensile strain at the same location was εt = 18.11 × 10^-5 for a 450 mm thick base of optimum mix 2. For a design traffic load of 10 MSA, the pavement performed admirably (βR and βF > 5) according to the probabilistic methodology; under larger traffic loads, it still exhibited above-average performance in terms of fatigue and rutting.
Intelligent Detection of Mobile SMS Spam via Machine Learning and Deep Learning
Authors: Megha Birthare, Neelesh Jain
Abstract: The global rise in social media usage has led to a surge in unwanted bulk SMS, necessitating an effective system to filter out these messages. Spam text messages are among the most prevalent nuisances on the internet: sending a spam-filled SMS is a straightforward task for spammers, who can also steal valuable data, including contacts and files, from our devices. In recent years, several word embedding techniques leveraging deep learning have been developed, and these advances in word representation offer a promising remedy for these problems. This study examines a natural language processing technique to distinguish between spam and ham texts using the SMS Spam Collection Dataset from the UCI Machine Learning Repository. We compared the accuracy and outcomes of Bi-LSTM and LSTM models, evaluating them with measures such as F1-score, recall, and accuracy. The study demonstrates that overall accuracy on the dataset increases when Bi-LSTM classification is used. All work is implemented in Python using a Jupyter notebook.
DOI: https://doi.org/10.61137/ijsret.vol.11.issue4.114
Designing Multi-Speciality Hospitals: Architectural Integration of Healing, Functionality, and Technology in Rural India
Authors: Anant Kumar, Professor Gulfam B. Shaikh, Professor Dilip L. Jade
Abstract: With the rapid urbanization of rural regions in India, the demand for healthcare infrastructure has surged dramatically. This research paper explores the architectural planning and design of a Multi-Speciality Hospital in Sikandarpur, Bihta, Patna (Bihar)—a region currently underserved in medical facilities. Emphasizing healing environments, sustainable strategies, patient-centered design, and technological integration, the paper outlines the necessity, process, and architectural responses to contemporary hospital design. A case study of Tata Medical Centre, Kolkata informs the practical and structural feasibility of the proposed design.
Autopapermine: Research Paper Information Extractor
Authors: Jinta Johnson, Assistant Professor Athira B, Professor Dr. Shine Raj G
Abstract: This paper presents a lightweight and intelligent system for the automatic extraction of structured information from academic research papers in PDF format. The proposed system leverages Natural Language Processing (NLP) techniques, TF-IDF-based summarization, and Sentence-BERT semantic similarity to extract and analyze metadata such as title, authors, organizations, keywords, and references. Built using Python and Streamlit, the tool allows users to upload PDF documents, parse academic content, and interactively review summarized metadata, references, and semantic relevance, all in real time. This paper details the system architecture, implementation pipeline, challenges, and experimental results, demonstrating its effectiveness and scope for future enhancements.
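The TF-IDF summarization step mentioned in this abstract can be sketched with the standard library alone (a minimal, illustrative sketch: the function name and scoring details are assumptions, and the actual system additionally applies Sentence-BERT reranking, which is omitted here):

```python
import math
import re
from collections import Counter

def tfidf_summarize(text, n=2):
    """Return the n sentences with the highest summed TF-IDF score.

    Illustrative sketch of TF-IDF extractive summarization; real pipelines
    add stopword removal, stemming, and semantic reranking.
    """
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    docs = [re.findall(r"[a-z]+", s.lower()) for s in sentences]
    df = Counter()                          # document frequency of each term
    for words in docs:
        df.update(set(words))
    total = len(docs)

    def score(words):
        if not words:
            return 0.0
        tf = Counter(words)
        return sum((c / len(words)) * math.log(total / df[w]) for w, c in tf.items())

    top = sorted(range(total), key=lambda i: score(docs[i]), reverse=True)[:n]
    return [sentences[i] for i in sorted(top)]   # preserve original order
```

Sentences containing rare (high-IDF) terms bubble to the top, which is the core intuition behind TF-IDF-based summarization.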
A Word Embedding Approach To Analyzing CEO Earnings Call Transcripts And Stock Market Reactions
Authors: Harsha Sammangi, Aditya Jagatha, Hari Gopal Maddireddy
Abstract: This study presents a sentiment-driven Decision Support System (DSS) that leverages advanced word embedding techniques—Word2Vec, GloVe, and BERT—to analyze CEO earnings call transcripts and predict stock market reactions. Traditional lexicon-based sentiment models fail to capture the nuanced, contextual language used by executives. By employing pre-trained embeddings and machine learning classifiers, the study enhances the accuracy of sentiment classification. The proposed system integrates quantitative sentiment scores with event study methodology to assess the impact of CEO tone on stock performance. Thematic analysis further enriches interpretability by identifying recurring patterns in executive communication. Results demonstrate that positive CEO sentiment generally correlates with stock appreciation, while negative sentiment aligns with declines. Among models tested, BERT outperformed others in classification accuracy. This research contributes to real-time financial analytics by embedding sentiment intelligence into DSS frameworks, supporting investors, analysts, and automated trading systems with improved decision-making capabilities grounded in contextual linguistic analysis.
DOI: https://doi.org/10.5281/zenodo.15845150
Integrating Time Series Forecasting And Business Intelligence: A Power BI Dashboard Approach For Sales Prediction
Authors: Vaishnavi Kane, Assistant Professor Dr. Suhas Mache, Assistant Professor Dr. Arshiya Khan
Abstract: In the modern business environment, accurate sales forecasting is essential for effective decision-making. This paper explores the integration of time series forecasting techniques with Business Intelligence (BI) tools, specifically Microsoft Power BI, to build an interactive dashboard for sales prediction. We present a model that combines statistical forecasting methods with data visualization to enhance decision-making in sales management. The system is designed to provide real-time insights, support strategic planning, and identify sales trends through dynamic dashboards. Our case study demonstrates that integrating forecasting models within BI platforms significantly improves sales predictability and operational efficiency. By combining forecasting models such as ARIMA and Prophet with Power BI's interactive features, the system enables better business decision-making, allowing users to visualize historical trends and forecasted sales data in real time.
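As a hedged sketch of the forecasting side of such a pipeline (the paper uses ARIMA and Prophet; here a simple ordinary-least-squares linear trend stands in, whose output could be exported as a table for a Power BI visual):

```python
def linear_trend_forecast(series, horizon):
    """Fit y = a + b*t by ordinary least squares and extrapolate `horizon`
    steps beyond the end of `series` (assumes len(series) >= 2).

    Illustrative stand-in for the ARIMA/Prophet models in the paper.
    """
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = sum(series) / n
    # slope = covariance(t, y) / variance(t)
    b = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series)) \
        / sum((t - t_mean) ** 2 for t in range(n))
    a = y_mean - b * t_mean
    return [a + b * (n + h) for h in range(horizon)]
```

For the perfectly linear series [1, 2, 3, 4, 5], a two-step forecast returns approximately [6.0, 7.0].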
Seismic Performance Analysis Of A G+11 Building Based On Live Earthquake Load Using ETABS Software
Authors: Keshav Kumar Ahirwar, Assistant Professor Mrs. Ankita Singhai, Mr. Rahul Satbhaiya
Abstract: A building must be able to withstand considerable ground vibrations during construction or operation to be deemed earthquake-resistant, yet ground motions affect structural responses in particular ways. Time-history analysis is effective for buildings subject to large ground vibrations: stepwise integration in the time domain of a multi-degree-of-freedom (MDOF) system illustrates a structure's response. This method is advantageous even though it is time-consuming. Pushover analysis was developed to speed up the design and assessment of seismic structures; it indicates that during seismic events, structures vibrate mostly in the lower or early modes. The multi-DOF system is then reduced to a single-DOF system using the characteristics revealed by its nonlinear static analysis, and a response spectrum analysis is performed on the ESDOF system using nonlinear time history analysis, damped analysis, or constant-ductility analysis. Modal links are used to translate ESDOF seismic demands into MDOF seismic demands. A model was created to show the overall behaviour of the RCC frame building. In seismic zone II, four G+11 concrete planar frames with four bays oriented in X and Y were modelled in compliance with Indian regulations, with five loading scenarios for every frame. Pushover analysis evaluates frames with different elevational anomalies under the same loading conditions, and the results differ from frame to frame. Each frame's capacity spectrum and pushover curve relating base shear to displacement are calculated and evaluated. STAAD.Pro V8i was utilized to analyze the non-linear response of the RCC frame with weighted parameters at seismic zone II stress, and infill walls and bare frames were compared.
Privacy-Preserving Collaborative Searchable Encryption Using Blake3 for Cloud-Based Group Data Sharing
Authors: Aatheni U, Dr. M. M. Janeela Theresa
Abstract: Collaborative searchable encryption for group data sharing enables authorized users to jointly generate trapdoors and retrieve encrypted data without compromising privacy. However, existing solutions remain vulnerable to keyword guessing attacks (KGAs) by malicious insiders and to subversion threats such as backdoors from untrusted hardware or software vendors. To overcome these security challenges, we propose a Privacy-Preserving Collaborative Searchable Encryption (PCSE) scheme using the BLAKE3 hash function. PCSE introduces a dedicated keyword server to enable server-derived keywords that resist insider KGAs, and employs cryptographic reverse firewalls to mitigate subversion risks. A distributed, multi-server keyword architecture is adopted to prevent single-point failures. The system also supports multi-keyword search and result verification, and includes a rate-limiting mechanism to restrict brute-force attempts. Formal analysis confirms resistance against KGAs and subversion attacks. Empirical evaluations demonstrate that PCSE achieves strong privacy, scalability, and efficient keyword-based search, making it suitable for secure cloud-based group data sharing.
DOI: https://doi.org/10.5281/zenodo.15847524
Blockchain-Based Insurance Claim
Authors: Priyanka Gupta, Hardik Gupta, Anurag Tomar, Piyush Raghav
Abstract: This project aims to revolutionize the insurance claims process by leveraging blockchain technology and smart contracts to address inefficiencies such as fraud, human errors, security risks, and high administrative costs. Traditional insurance claim processing is complex, requiring extensive human intervention, multi-domain interactions, and data from multiple sources, making it time-consuming and labor-intensive. By utilizing a private Ethereum blockchain and the Solidity programming language for smart contract development, this framework automates claim verification and settlement, ensuring transactions occur only if all predefined conditions are met. The integration of the Proof of Authority (PoA) consensus algorithm enhances transaction validation, improving security and transparency throughout the process. Additionally, decentralized applications (DApps) facilitate seamless user interactions, while the InterPlanetary File System (IPFS) enables off-chain data storage to maintain accessibility and immutability without overloading the blockchain network. This decentralized system prioritizes trust, transparency, and scalability, allowing for efficient processing of health insurance claims, particularly for prescription drugs, while significantly reducing operational costs. By combining blockchain’s transparency with scalable off-chain storage, this solution transforms the insurance sector, offering a reliable, secure, and cost-effective approach to claim management.
Novel Approach to Load Analysis of Multistory Building with Its Bending
Authors: Research Scholar Aman Singh Bais, Professor Rajesh Chouhan
Abstract: A multi-storey building has multiple floors above the ground and may be residential or commercial. This project deals with the analysis and design of a multi-storey building. In general, the analysis of multi-storey buildings is elaborate and rigorous because they are statically indeterminate structures. Shears and moments due to different loading conditions are determined by methods such as the portal method, moment distribution method, and matrix method. The dead and live loads are applied, and the design of beams, columns, and footings is carried out manually.
Enhancement of Load Bearing Capacity in Diagrid Multistory Building with Observed Torsion
Authors: Research Scholar Yawar Khan, Professor Sachin Sironiya
Abstract: The difference between conventional exterior braced-frame structures and modern diagrid structures is that in diagrid structures almost all conventional vertical columns are eliminated. This elimination is possible because the diagonal elements of a diagrid structural system can carry gravity loads as well as lateral forces, whereas the diagonals of conventional braced-frame structures carry only lateral loads. Steel is the most common and popular material for building diagrids. The sections commonly used are rectangular, circular, and wide-flange; their weight and size are chosen to withstand high bending loads.
Developing A Multi-Modal Edge-AI Framework For Continuous Infant Monitoring: Predicting Mental Health Outcomes
Authors: Dr. Sanjeev Puri, Sandeep Keshav
Abstract: The evolution of Edge-AI technologies has created new opportunities in pediatric healthcare, allowing for real-time monitoring of infants while maintaining privacy. This research introduces an innovative multi-modal Edge-AI framework that combines video, audio, and physiological data to anticipate potential mental health issues in infants. The proposed system processes information locally on edge devices, minimizing latency, enhancing privacy, and enabling continuous monitoring in both clinical and home settings. By employing lightweight AI models for on-device processing, the system promotes early identification of neurodevelopmental challenges and encourages timely interventions. This approach aims to shift healthcare from a reactive stance to a preventive one, ultimately aiming to foster long-term enhancements in mental health. The paper outlines the system’s architecture, techniques for optimizing AI models, and prospective applications in pediatric healthcare environments.
DOI: https://doi.org/10.5281/zenodo.15867669
A Generalized Tipping Condition For Arbitrary Geometric Objects Based On Contact Area And Applied Energy Using Cross Products
Authors: Rupsa Sarkar
Abstract: This work introduces a new energy-based model, grounded in cross-product torque analysis, that generalizes the tipping condition for rigid bodies of arbitrary shape. Classical mechanics employs torque to find rotational balance; this method additionally introduces the percent contact area (PCA) to define the extent to which the object is supported on a surface and how this influences tipping. The article presents a formula for the minimum external energy required to cause tipping, computing torque via the cross product and accounting for geometric distribution and weight. PyBullet simulations show high correlation with the model's predictions, affirming its validity.
DOI: https://doi.org/10.5281/zenodo.15867962
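The torque-balance core of a tipping condition can be sketched in 2-D with cross products (an illustrative simplification: the paper's percent-contact-area term and energy formulation are omitted, and the function name is hypothetical):

```python
def tips_about_edge(com, pivot, weight, force, force_point):
    """Net z-torque about a base edge in 2-D via cross products.

    com, pivot, force_point: (x, y) points; force: (Fx, Fy); weight: |W| > 0.
    Gravity (0, -weight) acting at the centre of mass `com` restores the body
    onto its base; the body tips over a right-hand base edge `pivot` when the
    net torque about that edge is clockwise (negative z-component).
    """
    def cross_z(r, f):
        # z-component of the cross product r x f for planar vectors
        return r[0] * f[1] - r[1] * f[0]

    tau_gravity = cross_z((com[0] - pivot[0], com[1] - pivot[1]), (0.0, -weight))
    tau_applied = cross_z((force_point[0] - pivot[0], force_point[1] - pivot[1]), force)
    return tau_gravity + tau_applied < 0
```

For a 1 m wide, 2 m tall box of weight 10 N pushed horizontally at its top edge, the restoring torque about the right base corner is 5 N·m, so a push stronger than 2.5 N tips it.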
AI-Driven QA In Print Production: Real-Time Monitoring For Zero-Defect Printing
Authors: Amit Sharma
Abstract: As the printing industry transitions into the era of Industry 4.0, traditional quality assurance methods—centered on manual inspection and reactive defect handling—are increasingly inadequate for the speed, complexity, and customization demands of modern pressrooms. This paper explores the transformative potential of Artificial Intelligence (AI) and Machine Learning (ML) in real-time monitoring and quality assurance (QA) across print production workflows. Leveraging technologies such as computer vision, IoT sensor networks, and predictive analytics, AI-enabled systems enable proactive defect detection, automated correction, and dynamic process optimization. Applications include in-line visual inspection, root cause analysis, intelligent alerting, and traceable compliance logging. Case studies demonstrate significant gains in defect reduction, throughput, and client satisfaction. However, adoption remains hindered by challenges such as legacy equipment integration, data infrastructure gaps, workforce readiness, and cybersecurity concerns. Future directions emphasize the role of digital twins, federated learning, cloud-based QA hubs, and sustainability-aware defect prevention. Ultimately, AI transforms quality assurance from a reactive function into a strategic enabler—advancing efficiency, brand protection, and environmental responsibility in next-generation print operations.
DOI: https://doi.org/10.5281/zenodo.15868325
Trends In Adoption And Challenges Of Green Computing In Africa
Authors: Eric Sifuna Siunduh, Victor Mony Otieno
Abstract: Green computing has emerged as a critical strategy for achieving sustainable technological growth globally. In Africa, this approach offers an essential path to mitigating pressing issues such as energy scarcity, poor e-waste management, and environmental degradation. Green computing encompasses environmentally friendly practices in the design, usage, and disposal of information and communication technology (ICT) infrastructure. This paper explores trends, drivers, challenges, and future directions of green computing adoption in Africa, emphasizing the role of data centers, cloud computing, and policy enforcement. A mixed-methods approach was employed, utilizing secondary sources such as journal articles, case studies, and institutional reports published between 2019 and 2024. Key findings reveal that renewable energy integration in ICT operations, policy support, and international collaborations are driving adoption in countries like Kenya, South Africa, and Rwanda. However, persistent barriers—such as limited infrastructure, high initial costs, and a shortage of skilled professionals—continue to hinder widespread implementation. The study recommends coordinated stakeholder action, increased investment in education and renewable energy, and the development of regional green ICT hubs. This paper contributes to understanding how Africa can align digital transformation with sustainability, advancing both environmental goals and socio-economic development.
Cloud Computing And Web 3.0 Technologies For Effective Public Participation: The African Context
Authors: Eric Sifuna Siunduh, Zachary Mwangi, Dr. Alice Nambiro Wechuli
Abstract: The increasing adoption of cloud computing and Web 3.0 technologies offers transformative potential for public governance in Africa, particularly in enhancing citizen participation. Despite various efforts to digitize public services, many governments still struggle to ensure inclusive, transparent, and interactive participation frameworks. This paper examines how cloud computing and Web 3.0 technologies can be harnessed to empower citizens and strengthen e-participation in the African context. It explores the integration of semantic web, blockchain, and machine learning to facilitate interactive e-governance platforms. By employing an ex post facto research design, the study synthesizes empirical and theoretical insights to develop a model for citizen empowerment. Findings show that cloud-based platforms significantly increase accessibility and engagement, while Web 3.0 tools foster real-time collaboration and personalization. The proposed empowerment model emphasizes decentralization, transparency, and inclusivity. The study concludes with policy recommendations to foster digital literacy, improve infrastructure, and safeguard data governance for sustainable civic engagement.
Fault-Tolerant Software Architecture: A Comprehensive Analysis Of Design Patterns, Implementation Strategies And Performance Evaluation
Authors: Mr. Eric Sifuna Siunduh, Mr. Victor Mony Otieno, Professor Samuel Mbugua
Abstract: Fault-tolerant software architecture has become increasingly critical in modern distributed systems, where system failures can result in significant economic losses and service disruptions. This research paper provides a comprehensive analysis of fault-tolerant software architecture design patterns, implementation strategies, and performance evaluation methodologies. Through a systematic literature review of 45 peer-reviewed articles published between 2020-2024, this study identifies key architectural patterns including redundancy-based approaches, checkpoint-restart mechanisms, and self-healing systems. The methodology employed includes comparative analysis of fault tolerance techniques, performance benchmarking, and case study evaluation of real-world implementations. Data analysis reveals that hybrid approaches combining multiple fault tolerance strategies achieve 99.99% system availability with 15-30% performance overhead. Results demonstrate that microservices architectures with circuit breaker patterns and service mesh implementations provide superior fault isolation compared to monolithic systems. The discussion includes detailed analysis of trade-offs between fault tolerance levels and system performance, supported by empirical data from 12 case studies. Key findings indicate that automated recovery mechanisms reduce mean time to recovery (MTTR) by 65% compared to manual intervention approaches. This research contributes to the field by providing a comprehensive framework for evaluating fault-tolerant architectures and offers practical guidelines for system architects. Future research directions include exploration of AI-driven fault prediction, quantum-resistant fault tolerance mechanisms, and edge computing fault tolerance strategies.
DOI: https://doi.org/10.5281/zenodo.15868925
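The circuit-breaker pattern credited above with superior fault isolation can be sketched minimally (the class name, thresholds, and half-open policy below are illustrative assumptions, not taken from the paper):

```python
import time

class CircuitBreaker:
    """Minimal circuit-breaker sketch: after `max_failures` consecutive
    failures the breaker opens and fails fast; after `reset_timeout`
    seconds it lets one trial call through (half-open) and closes again
    if that call succeeds."""

    def __init__(self, max_failures=3, reset_timeout=30.0):
        self.max_failures = max_failures
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None   # monotonic timestamp when the breaker opened

    def call(self, fn, *args, **kwargs):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_timeout:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None   # half-open: allow one trial call
        try:
            result = fn(*args, **kwargs)
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0           # success closes the breaker
        return result
```

Failing fast while open is what isolates a faulty downstream service: callers stop queuing requests against it, which is the fault-isolation property the abstract attributes to this pattern.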
Advanced Construction Techniques In Modern Sports Complexes
Authors: Prof. Ar. Gulfam Shaikh, Vedant Mundiwale, Prof. Ar. Dilip Jade
Abstract: The evolution of sports complexes has undergone a significant transformation over the last few decades, driven by advances in construction technologies, materials, and design philosophies. Modern sports infrastructure demands multifunctionality, sustainability, spectator comfort, and technological integration. This study explores the advanced construction techniques employed in modern sports complexes, including long-span roofing systems, modular construction, precast technologies, tensile membrane structures, and smart building systems. These approaches contribute to faster construction, cost efficiency, structural integrity, and enhanced user experience. Case studies such as Jawaharlal Nehru Stadium, Delhi and Tottenham Hotspur Stadium contextualize these practices. Challenges like cost overruns, design complexity, and sustainability mandates are also examined. The study concludes by emphasizing innovation, interdisciplinary collaboration, and future-ready design strategies for sports architecture.
Evaluating Mechanistic Data Analysis Methods For Machine Learning On Effects Of Climate Change In Africa
Authors: Eric Sifuna Siunduh, Zachary Mwangi, Dr. Anselemo Peters Ikoha
Abstract: Climate change poses unprecedented challenges to African nations, necessitating sophisticated analytical approaches to understand and predict its multifaceted impacts. This study evaluates the effectiveness of mechanistic data analysis methods in machine learning applications for assessing climate change effects across Africa. Through a comprehensive analysis of temperature, precipitation, and socioeconomic data from 2020-2024, the study compared traditional statistical approaches with mechanistic machine learning models including physics-informed neural networks (PINNs), causal inference frameworks, and hybrid mechanistic-statistical models. The methodology integrated satellite data, ground-based observations, and socioeconomic indicators from 54 African countries, employing cross-validation techniques and mechanistic validation approaches. Results demonstrate that mechanistic methods significantly outperform traditional approaches in prediction accuracy (RMSE improved by 23-31%) and interpretability. Physics-informed models showed superior performance in temperature prediction (R² = 0.89) while causal inference frameworks excelled in understanding precipitation-agriculture relationships. The study reveals critical insights into drought patterns, agricultural vulnerability, and urban heat island effects across different African climatic zones. Key findings indicate that mechanistic approaches provide more robust predictions for policy-relevant scenarios, particularly in data-sparse regions common across Africa. However, computational complexity and data requirements present implementation challenges. The study recommends the integration of mechanistic methods with traditional approaches for comprehensive climate impact assessment, emphasizing the need for capacity building and infrastructure development to support widespread adoption of these advanced analytical techniques in African climate research.
DOI: https://doi.org/10.5281/zenodo.15869058
Tribal Building Techniques
Authors: Ritesh Vinod Deshmukh
Abstract: Indigenous communities across India have evolved construction systems that respond beautifully to their surroundings, traditions, and material availability. This research explores the tribal building techniques of the Gond community in Maharashtra—techniques shaped by centuries of climate interaction, local materials, and symbolic customs. Through on-site documentation, literature studies, and direct interactions with tribal members, the study uncovers the relevance of these methods in sustainable and context-sensitive architecture. This paper emphasizes how integrating tribal techniques into modern design, like in the proposed Gond Tribal Cultural Centre near Tadoba, can restore ecological balance and cultural identity in rural development.
Aerodynamic Analysis Of UCAV Ghatak Using CFD
Authors: Saraniyan M, Assistant Professor Gowtham G
Abstract: The Unmanned Combat Aerial Vehicle (UCAV) Ghatak represents a pivotal advancement in stealth and autonomous aerial warfare technology. This project focuses on a comprehensive aerodynamic analysis of the Ghatak UCAV using Computational Fluid Dynamics (CFD) tools. By simulating realistic flight conditions, this study evaluates the aerodynamic performance parameters such as lift, drag, pressure distribution, and flow behavior across the stealth-designed airframe. The analysis takes into account various angles of attack and flight regimes to understand the aircraft’s stability and efficiency. The objective is to optimize the aerodynamic characteristics of Ghatak to ensure superior performance, maneuverability, and reduced radar signature. The insights gained from this CFD-based investigation will contribute to enhancing the combat readiness and operational effectiveness of next-generation UCAVs.
Sustainable Strategies In Public Transportation Hubs
Authors: Vivek Sonone, Prof. Malini Nathe, Prof. Radhika Raut, Prof. Saiyam S. Chaturvedi, Dr. Sudhir V. Dhomane, Principal Dr. P. V. Thorat
Abstract: Public transportation hubs serve as critical nodes in urban mobility networks, significantly influencing environmental, economic, and social sustainability. This research explores sustainable strategies implemented in transportation hubs to enhance energy efficiency, reduce carbon emissions, and promote multimodal integration. The study examines case studies from global cities, highlighting innovative practices such as green infrastructure, renewable energy integration, smart mobility technologies, and inclusive design. It also assesses policy frameworks, stakeholder involvement, and the role of digitalization in optimizing operations. The findings emphasize the importance of systemic planning and cross-sector collaboration to transform transportation hubs into resilient and sustainable urban assets.
Aerodynamic Analysis And Improvement Of Noiseless UAV Propeller Designs Using CFD
Authors: Sridhar A S, Assistant Professor Gowtham G
Abstract: Unmanned Aerial Vehicles (UAVs) have become integral to various applications, yet their operational efficiency and noise emission remain critical challenges. This project focuses on the aerodynamic analysis and optimization of noiseless UAV propeller designs using Computational Fluid Dynamics (CFD). By leveraging advanced CFD simulations, we assess the airflow characteristics, thrust generation, and acoustic behavior of different propeller configurations. The study aims to reduce aerodynamic noise while maintaining or enhancing propulsive efficiency. Design modifications, including blade geometry refinement and tip treatments, are evaluated to identify optimal configurations that balance performance with acoustic stealth. The results provide actionable insights into developing quieter UAV systems suitable for sensitive missions and urban environments, where noise reduction is essential. This research contributes to the growing demand for UAVs that are both aerodynamically efficient and environmentally compliant.
The Psychology Of Space In Museum Environment
Authors: Imran Khan, Ar Malini O Nathe, Ar Radhika Raut
Abstract: The spatial design of museums significantly influences how visitors perceive, experience, and emotionally connect with exhibits. This research explores the psychological dimensions of space within museum environments, examining how architectural elements—such as spatial layout, lighting, scale, materiality, circulation, and enclosure—affect cognitive engagement, emotional response, and visitor behavior. Drawing on environmental psychology, spatial theory, and museum studies, this study analyzes how spatial configurations can either enhance or inhibit interpretive experiences and memory retention. Case studies from contemporary and traditional museums are evaluated to understand how spatial strategies support narrative storytelling, accessibility, and user comfort. The findings highlight the critical role of psychologically responsive design in creating immersive and meaningful museum experiences, informing both curatorial strategies and architectural practices.
Performance Assessment and Environmental Benefits of Emulsion-Based Warm Mix Asphalt
Authors: Research Scholar Mr. Arun Kumar Pyasi, Assistant Professor Mr. Hariram Sahu
Abstract: The adoption of Warm Mix Asphalt (WMA) offers a sustainable alternative to conventional Hot Mix Asphalt (HMA), especially in developing countries like India where energy efficiency and cost-effectiveness are critical. This study investigates a simplified method of producing WMA using a medium-setting bitumen emulsion and VG 30 binder for Dense Bituminous Macadam (DBM) mixes. Marshall samples were prepared at three different mixing temperatures—110°C, 120°C, and 130°C—with varying bitumen-to-emulsion (B:E) ratios ranging from 50:50 to 100:0. Marshall Stability, flow value, unit weight, air voids, voids in mineral aggregate (VMA), and voids filled with bitumen (VFB) were evaluated to determine the optimum binder content (OBC). The results revealed that a B:E ratio of 70:30 at 120°C provided the highest Marshall Stability value of 11.6 kN with acceptable volumetric parameters, demonstrating the optimal balance between strength and workability. The study confirms that emulsion-based WMA can deliver comparable mechanical properties to HMA while enabling lower production temperatures, making it suitable for Indian climatic and economic conditions.
DOI: https://doi.org/10.5281/zenodo.15876538
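The volumetric screening this abstract reports rests on the standard Marshall relations; the link between air voids (Va), VMA, and VFB can be sketched in a few lines. The numeric inputs below are illustrative, not values from the paper:

```python
def vfb(vma_percent: float, air_voids_percent: float) -> float:
    """Voids Filled with Bitumen (%), from the standard Marshall
    volumetric relation VFB = (VMA - Va) / VMA * 100."""
    return (vma_percent - air_voids_percent) / vma_percent * 100.0

# Illustrative check (not data from the study): a mix with
# VMA = 15% and air voids Va = 4% has VFB = 73.3%.
print(round(vfb(15.0, 4.0), 1))  # 73.3
```

In practice the computed Va, VMA, and VFB values for each trial blend are compared against agency-specified acceptance bands when selecting the optimum binder content.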
Designing Inclusive Spaces
Authors: Prof. Gulfam Shaikh, Gauri Gajanan Mankar, Prof. Dilip Jade
Abstract: Art and cultural hubs have emerged as powerful instruments in reshaping urban identities and fostering community engagement. This study explores the role of such spaces in promoting inclusivity while catalyzing urban regeneration. As cities grapple with socio-economic disparities, cultural fragmentation, and deteriorating public spaces, art and cultural centers offer dynamic platforms for interaction, expression, and innovation. These hubs serve not only as venues for creative expression but also as vital nodes that bridge diverse communities, stimulate local economies, and rejuvenate underutilized urban fabric. Through a multidisciplinary lens combining urban design, social theory, and cultural policy, this paper investigates how thoughtfully designed art and cultural hubs can facilitate inclusivity—physically, socially, and economically. It also examines global and local case studies that highlight successful regeneration projects driven by cultural infrastructure. Emphasis is placed on accessibility, participatory design processes, adaptive reuse of heritage structures, and integration with public space networks. The study concludes that inclusive design in cultural hubs—rooted in context, community, and climate—can transform marginalized urban areas into vibrant, equitable, and resilient neighborhoods, ultimately reinforcing a city’s cultural capital and collective identity.
CO2 Emission Rating by Vehicles Using Data Science
Authors: Assistant Professor Mrs. G. Sangeetha Lakshmi, Ms. S. Devagi
Abstract: Amid growing concerns about climate change and environmental sustainability, accurately evaluating vehicle CO₂ emissions has become increasingly important. As transportation remains a major source of greenhouse gases, there is a need for advanced, data-driven solutions to monitor and assess emission levels effectively. This project introduces a deep learning model based on Convolutional Neural Networks (CNN) to classify and rate vehicle emissions by analyzing key attributes such as fuel type, engine capacity, mileage, and emission standards. Unlike traditional rule-based or statistical methods that often struggle with complex and large datasets, CNNs excel at automatically extracting meaningful features, leading to higher prediction accuracy and adaptability. Trained on a comprehensive dataset of vehicle emission records, the model classifies vehicles into various emission categories, offering valuable insights for regulators, manufacturers, and consumers. By combining deep learning with data science, this system provides a scalable and automated method for emissions evaluation, promoting the adoption of energy-efficient vehicles and stricter environmental regulations. Furthermore, the model has the potential for real-time emission monitoring, aiding in better air quality management and supporting the shift toward greener transportation technologies.
The Evolving Landscape Of Digital Currency: Exploring Cryptocurrency’s Role In The Global Economy
Authors: Dr. Raju Ghanhyam Shrirame
Abstract: The rise of digital currency has transformed the global financial landscape, with cryptocurrency emerging as a disruptive force in traditional economic systems. Over the past decade, cryptocurrencies such as Bitcoin, Ethereum, and a range of altcoins have challenged conventional banking structures, offering decentralized, borderless, and highly secure financial transactions. This paper explores the evolving role of cryptocurrency in the global economy, assessing its impact on financial inclusion, regulatory challenges, monetary policy, and the future of digital finance. Cryptocurrency, built on blockchain technology, provides a decentralized method of exchange that eliminates the need for intermediaries such as banks and financial institutions. This shift has introduced benefits such as increased financial autonomy, lower transaction costs, enhanced security, and the potential to provide banking services to unbanked populations. However, it also presents risks, including volatility, regulatory uncertainty, security concerns, and its association with illicit financial activities. The adoption of digital currencies by individuals, businesses, and governments reflects growing recognition of their potential while also raising questions about their long-term sustainability and integration into the global financial system. One of the key drivers of cryptocurrency adoption is the pursuit of financial inclusion. In many developing regions where traditional banking infrastructure is lacking, cryptocurrencies provide an alternative means of financial participation. Blockchain-based transactions do not require credit histories, bank accounts, or financial intermediaries, enabling people in underserved regions to access financial services. Stablecoins, which are pegged to fiat currencies, have gained particular traction as they offer stability while retaining the benefits of digital assets.
Cross-border transactions, which are traditionally costly and slow due to banking restrictions, can be executed more efficiently using cryptocurrencies, making them an attractive solution for remittances and international trade. Despite its advantages, cryptocurrency remains highly volatile, with price fluctuations influenced by speculation, regulatory developments, technological advancements, and macroeconomic trends. Bitcoin, for example, has experienced extreme price swings, making it an attractive asset for speculative investors but a risky store of value. The volatility of cryptocurrencies raises concerns about their suitability as a stable medium of exchange and unit of account. Governments and financial institutions remain divided on how to integrate digital currencies into existing financial frameworks without disrupting monetary stability. Regulatory challenges continue to shape the future of cryptocurrency. Governments and central banks are actively working to establish guidelines for digital assets, balancing innovation with consumer protection and financial stability. While some countries, such as El Salvador, have embraced Bitcoin as legal tender, others have imposed strict restrictions or outright bans due to concerns about money laundering, fraud, and tax evasion. The lack of standardized global regulations creates uncertainty, leading to fluctuating adoption rates and investor sentiment. In response, many nations are exploring the development of central bank digital currencies (CBDCs), which offer the efficiency of digital assets while maintaining government control over monetary policy. Institutional adoption has played a pivotal role in legitimizing cryptocurrency within mainstream financial markets. Major corporations, payment processors, and investment funds have integrated digital assets into their portfolios, acknowledging their potential as a hedge against inflation and an alternative investment vehicle.
Financial institutions and companies such as JPMorgan, PayPal, and Tesla have embraced cryptocurrency, whether through direct investment, payment facilitation, or the development of blockchain-based services. The emergence of decentralized finance (DeFi) platforms has further expanded cryptocurrency's use cases, enabling lending, borrowing, staking, and yield farming without traditional financial intermediaries. The impact of cryptocurrency extends beyond financial markets, influencing sectors such as supply chain management, digital identity verification, and smart contract automation. Blockchain technology provides transparency, traceability, and security, making it a valuable tool for industries seeking to improve operational efficiency. Smart contracts, self-executing agreements powered by blockchain, enable automated transactions and reduce reliance on third parties, lowering costs and increasing efficiency in legal and business processes. However, security concerns persist in the digital currency ecosystem. While blockchain itself is highly secure due to its decentralized nature, cryptocurrency exchanges, wallets, and smart contracts are vulnerable to hacking, fraud, and cyberattacks. High-profile exchange breaches and decentralized finance exploits have resulted in significant financial losses, highlighting the need for stronger security measures and regulatory oversight. Additionally, the irreversible nature of cryptocurrency transactions poses challenges in cases of fraud or accidental transfers, necessitating the development of user-friendly dispute resolution mechanisms. The environmental impact of cryptocurrency mining has also raised concerns. Bitcoin and other proof-of-work (PoW) cryptocurrencies require substantial energy consumption for transaction validation and network security. Critics argue that this energy-intensive process contributes to carbon emissions and environmental degradation.
In response, some blockchain networks are transitioning to more sustainable consensus mechanisms such as proof-of-stake (PoS), which significantly reduces energy consumption while maintaining network integrity. The push for eco-friendly blockchain solutions aligns with broader global efforts to promote sustainability in digital finance. As the cryptocurrency landscape continues to evolve, its integration into the global economy will depend on technological advancements, regulatory developments, and mainstream adoption. Governments and financial institutions must collaborate to establish clear guidelines that support innovation while mitigating risks. The potential for cryptocurrency to reshape global finance is undeniable, but its long-term success hinges on addressing volatility, security, regulatory concerns, and environmental sustainability. In the coming years, we may witness greater convergence between traditional finance and digital assets. The rise of central bank digital currencies (CBDCs) could coexist with decentralized cryptocurrencies, creating a hybrid financial system that leverages the benefits of both models. Additionally, the adoption of blockchain technology in sectors beyond finance, such as healthcare, education, and governance, could further demonstrate its transformative potential. In conclusion, cryptocurrency represents a paradigm shift in the way we perceive and interact with money. While challenges remain, its disruptive impact on traditional financial systems cannot be ignored. Whether it ultimately serves as a mainstream medium of exchange, a speculative investment, or a decentralized financial infrastructure, its role in the global economy will continue to evolve. Stakeholders, including governments, businesses, and consumers, must navigate this rapidly changing landscape to harness its benefits while mitigating its risks.
The future of digital currency is unfolding, and its implications will shape the economic and financial structures of the 21st century.
Performance Evaluation of PET and Glass Fiber Modified High Modulus Asphalt Concrete under Simulated Extreme Indian Climates
Authors: Research Scholar Mr. Satyaveer Dhakad, Assistant Professor Mr. Hariram Sahu
Abstract: This study evaluates the mechanical and durability performance of High Modulus Asphalt Concrete (HMAC) modified with Polyethylene Terephthalate (PET) and Glass Fibers using a PMB 70 binder, under temperature and moisture conditions that simulate extreme Indian climates. Fiber dosages ranging from 0.0% to 0.4% by total mix weight were incorporated into Dense Bituminous Macadam (DBM) grade mixes. A series of laboratory tests, including Marshall Stability, Indirect Tensile Strength (ITS), Dynamic Modulus, and Tensile Strength Ratio (TSR), were conducted on specimens prepared at Optimum Binder Content (OBC) and varying fiber contents. Results indicate that the inclusion of 0.3% Glass Fiber improved Marshall Stability by 21.5% and reduced flow values by 14.2%, suggesting enhanced rutting resistance under high-temperature loading. PET Fiber, at 0.3%, yielded a 27.4% increase in ITS at –10°C, indicating improved resistance to low-temperature cracking. Both fiber types enhanced volumetric parameters, with Void Filled with Bitumen (VFB) exceeding 76% in all modified mixes. Moisture susceptibility improved substantially, with TSR values exceeding 85% for both fiber-reinforced mixes, signifying strong adhesion in the presence of water. The Dynamic Modulus of Glass Fiber mixes increased by 17.8% at 25°C compared to the control mix, while PET-modified mixes exhibited better modulus performance at lower frequencies. These findings suggest that Glass Fiber is more effective in improving high-temperature performance and stiffness, whereas PET Fiber excels in enhancing flexibility, crack resistance, and sustainability. The study supports the use of both fibers in region-specific applications, offering improved pavement life and reduced environmental burden.
Applications Of Radiology In Ophthalmology
Authors: Fiza Shabir Mir
Abstract: Radiology plays a crucial role in ophthalmology by providing non-invasive imaging techniques for the diagnosis and management of ocular diseases. With advancements in imaging technology, modalities such as MRI, CT, ultrasonography (USG), and Optical Coherence Tomography (OCT) have significantly improved the detection of retinal, orbital, and neuro-ophthalmic conditions. This paper explores the applications of radiological imaging in ophthalmology, highlighting recent techniques, advancements, and their clinical impact.
Improving Student Assessment in E-learning
Authors: Professor Sakshi M. Rahangdale, Ms. Aachal Harinkhede, Ms. Dolly Raghorte, Ms. Jyoti Patle
Abstract: With the increasing reliance on digital platforms for education, e-learning has emerged as a powerful alternative to traditional classroom instruction. However, assessing students effectively in online environments poses numerous challenges, including issues of academic integrity, student engagement, technological access, and diverse learning styles. This paper focuses on improving student assessment practices in e-learning by exploring both pedagogical and technological innovations. It examines the limitations of conventional assessments like timed exams and emphasizes the need for more adaptive, continuous, and formative evaluation techniques. Advanced tools such as learning analytics, AI-driven assessment systems, and gamified quizzes are discussed for their potential to enhance objectivity and provide real-time feedback. Furthermore, the paper highlights the importance of inclusive assessment strategies that consider learners with varied needs, ensuring accessibility and fairness. It also underscores the need for teacher training in digital pedagogy to effectively design and implement online assessments. Strategies such as peer assessment, project-based evaluation, open-book exams, and scenario-based learning are proposed to make assessments more reflective of real-world understanding. By addressing technological, pedagogical, and ethical dimensions of e-assessment, this study aims to provide a comprehensive framework for educators and policymakers to design more robust, student-centered assessment systems in virtual learning environments. The insights presented can contribute to improving learning outcomes, promoting academic honesty, and increasing student satisfaction in e-learning ecosystems.
Research On Quality Hospice Care Centre In India
Authors: Dr. P. V. Thorat, Prof. Malini Nathe, Prof. Radhika Raut, Prof. Saiyam S. Chaturvedi, Dr. Sudhir V. Dhomane, Shrinivas Shivaji Pote
Abstract: India’s approach to hospice and palliative care has evolved significantly over the past two decades, driven by a growing need for compassionate end-of-life care amidst an aging population and rising burden of chronic illnesses. This research explores quality hospice care centers across India, highlighting leading institutions such as Karunashraya in Bengaluru, Sparsh Hospice in Hyderabad, the Institute of Palliative Medicine in Kozhikode, Aastha Hospice in Lucknow, and CanSupport in New Delhi. These centers exemplify best practices in hospice care, combining medical support, emotional counseling, and family engagement, often free of cost or heavily subsidized. National efforts, including the development of the Minimum Standards for Palliative Care Programs by Pallium India and NABH accreditation, aim to standardize and improve care quality across centers. Centers recognized by international organizations like the World Health Organization (WHO) and the International Association for Hospice and Palliative Care (IAHPC) demonstrate excellence through community outreach, home care, and professional training initiatives. Despite challenges in accessibility, awareness, and funding, the emergence of such institutions indicates progress toward equitable, patient-centered hospice care in India. Further investment in policy, public awareness, and medical training is essential to scale these models nationwide.
Comparative Efficacy of Diaphragmatic Breathing and Respiratory Muscle Stretch Gymnastics in Interstitial Lung Disease: A Randomized Controlled Trial
Authors: Associate Professor Dr. Vandana, Assistant Professor Dr. Sapna Shokeen
Abstract: Background: Interstitial Lung Disease (ILD) profoundly impacts respiratory function, leading to restrictive ventilatory impairment, diminished respiratory muscle efficiency, and reduced exercise capacity. Two rehabilitative approaches, Diaphragmatic Breathing (DB) and Respiratory Muscle Stretch Gymnastics (RMSG), are commonly employed to enhance respiratory mechanics in these patients. Objective: This study aimed to critically compare the distinct effects of DB and RMSG on crucial clinical parameters, including pulmonary function, severity of dyspnea, respiratory muscle strength, and overall quality of life in individuals diagnosed with ILD. Methods: A single-blind randomized controlled trial (RCT) was conducted involving 60 ILD patients. Participants were equally and randomly allocated to one of two intervention groups: Group A (n=30) received Diaphragmatic Breathing exercises, while Group B (n=30) underwent Respiratory Muscle Stretch Gymnastics. Both interventions were delivered consistently, three times per week for a duration of eight weeks. Key outcome measures included spirometry readings (Forced Vital Capacity [FVC], Forced Expiratory Volume in 1 second [FEV1]), Maximum Inspiratory Pressure (MIP), the modified Medical Research Council (mMRC) Dyspnea Scale, and the St. George’s Respiratory Questionnaire (SGRQ). Results: Post-intervention analysis revealed significant improvements in both groups across various parameters (p<0.05). Notably, RMSG led to more substantial gains in chest expansion, demonstrating a 17.5% increase compared to 9.8% in the DB group (p<0.01), and a greater reduction in dyspnea severity (mMRC score change of 1.4 vs. 0.9). Conversely, DB exhibited superior enhancements in Maximum Inspiratory Pressure, with an average increase of 12 cmH₂O compared to 7 cmH₂O in the RMSG group. 
Conclusion: The findings suggest that RMSG is particularly effective in improving thoracic mobility and alleviating symptoms of dyspnea, while DB plays a more prominent role in optimizing inspiratory muscle strength. The distinct benefits observed underscore the potential for a combined rehabilitative strategy to offer synergistic and comprehensive improvements for patients with ILD.
Holistic Design Process in Modular Integrated Construction Technology – A Structured Literature Review
Authors: Joachim Zwicky, Sandra Filipe, Fernanda Rodrigues
Abstract: The purpose of the current study is to examine the State of the Art (SoTA) in Modular Integrated Construction (MiC), with an analytical focus on a holistic design process in terms of (A) Design, (B) BIM and Digitalization, and (C) Sustainability. The identified findings and research gaps are presented in a structured way, and an approach is outlined to describe a holistic Modular Integrated Construction model that considers all research fields. This paper also provides a framework for practitioners within the construction industry, and especially for start-up companies in the construction technology segment. The current State-of-the-Art review was guided by a Systematic Literature Review. The search engine Web of Science was used to find the required literature using a set of keywords. In the search and literature review, the relevance and recency of papers were considered as selection criteria. A total of 54 articles were examined and analysed. The paper examines aspects concerning Modular Integrated Construction (MiC) and sustainable construction practices. It emphasizes the necessity of employing a comprehensive design approach in MiC to enhance sustainability performance through progressive technologies and tactics. The paper underscores the value of utilizing automated generative design systems and advanced simulation methods to choose optimal building layouts, components, and materials in MiC projects. Furthermore, it highlights the importance of incorporating Design for Manufacture and Assembly (DfMA) principles in MiC to simplify construction processes and streamline work packages. Lastly, it discusses the development of Internet of Things (IoT)-enabled smart Building Information Modeling (BIM) platforms for on-site assembly services in MiC projects. The topic of Modular Integrated Construction technology with components prefabricated in a factory has already been investigated in many scientific papers.
As a rule, the criteria were examined separately or in combination with two criteria, such as Design for Manufacturing and Design for Assembly, or BIM and Sustainability. With this State-of-the-Art research, a holistic approach is pursued in which the findings and research gaps are structured and provide evidence-based knowledge in order to conceive an innovative and holistic Modular Integrated Construction model combining the criteria of Design, BIM and Digitalization, and Sustainability. Modular construction technology is being discussed at the scientific, political, and economic level and is often presented as the “game changer” to make housing affordable, particularly in urban areas and larger cities around the globe. The new German government, for example, announced at the end of 2021 a demand for 400,000 new apartments per year, mainly in terms of social residential housing; in California, the target is to build 3.5 million new apartments by 2025; and in Hong Kong, around 200,000 people live in shacks of less than two square metres and pay as much as for a room in a shared apartment in Germany. The housing shortage mentioned above is becoming the driver of a necessary transformation in the construction industry. Important cornerstones of this transformation are to make construction faster and less cost-intensive, to make the value creation processes more sustainable, and to interconnect the separate process steps from planning through to the recycling of building materials or entire construction modules at the end of a building's life cycle.
Real Time Chat App
Authors: Sarvesh Kumar, Naresh Kumar Kurmi, Rahul Kumar, Suresh Kumar, Shivansh Mishra, Associate Professor Satyarth Tiwari
Abstract: This paper presents the design and implementation of a real-time chat application that facilitates instantaneous communication between users. Leveraging WebSocket technology and a modern web development stack (Node.js, Express.js, and Socket.IO), the system supports low-latency, bi-directional communication. Security, scalability, and user experience are considered in the system architecture. Performance tests indicate the system handles multiple concurrent users effectively with minimal latency.
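The fan-out step at the heart of such a system can be illustrated independently of Socket.IO. The sketch below is a stdlib-Python stand-in for the broadcast logic only; the class and client names are hypothetical, not taken from the paper's code:

```python
from typing import Callable, Dict

class ChatHub:
    """Minimal in-memory stand-in for the broadcast step a chat server
    performs: every message from one client is fanned out to all other
    connected clients. Illustrative only."""

    def __init__(self) -> None:
        self._clients: Dict[str, Callable[[str], None]] = {}

    def connect(self, client_id: str, on_message: Callable[[str], None]) -> None:
        self._clients[client_id] = on_message

    def disconnect(self, client_id: str) -> None:
        self._clients.pop(client_id, None)

    def send(self, sender_id: str, text: str) -> None:
        # Broadcast to everyone except the sender.
        for cid, deliver in self._clients.items():
            if cid != sender_id:
                deliver(f"{sender_id}: {text}")

# Usage sketch: alice receives bob's message; bob does not echo to himself.
inbox: list = []
hub = ChatHub()
hub.connect("alice", inbox.append)
hub.connect("bob", lambda m: None)
hub.send("bob", "hello")
print(inbox)  # ['bob: hello']
```

In the real system, Socket.IO replaces the in-memory callback table with WebSocket connections and adds rooms and reconnection handling, but the broadcast pattern is the same.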
Blockchain-Enabled Final Seal Verification for Tea Supply Chain Integrity
Authors: Abhijit Kakoty
Abstract: This paper presents a blockchain-based prototype for enhancing traceability and data integrity in the tea supply chain through a Final Seal Verification mechanism. The system integrates Ethereum blockchain (via Ganache and Truffle), Node.js, Solidity smart contracts, and Laravel to ensure the authenticity of critical supply chain events. By hashing and sealing traceability data onto an immutable blockchain, the system can detect tampering attempts and provide customers with a reliable method of batch verification via QR codes.
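The sealing mechanism the abstract describes (hash the traceability data, store the digest immutably, recompute off-chain to verify) can be sketched without any blockchain at all; the function names and event fields below are illustrative, and in the paper's system the digest would be written via a Solidity smart contract rather than kept locally:

```python
import hashlib
import json

def seal(events: list) -> str:
    """Compute a deterministic SHA-256 'final seal' over a batch's
    traceability events. Canonical JSON (sorted keys, fixed separators)
    makes the digest independent of dict ordering."""
    canonical = json.dumps(events, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def verify(events: list, sealed_digest: str) -> bool:
    """Recompute the seal and compare with the stored digest."""
    return seal(events) == sealed_digest

batch = [{"stage": "plucking", "lot": "T-101"},
         {"stage": "packing", "lot": "T-101"}]
digest = seal(batch)
assert verify(batch, digest)       # untampered batch passes
batch[0]["lot"] = "T-999"          # simulate tampering
assert not verify(batch, digest)   # tampering is detected
```

A QR code printed on the pack would then carry the batch identifier, letting a customer fetch the events, recompute the hash, and compare it against the on-chain seal.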
1.58-Bit Large Language Model (LLM)
Authors: Kumarswamy S, Vidya Laxman Gadekar, Manasi
Abstract: Large Language Models (LLMs) have transformed natural language processing (NLP), achieving state-of-the-art performance across numerous applications. Nonetheless, their computational and memory requirements remain a barrier to deployment in resource-constrained environments. In this paper, we describe the development of a 1.58-bit LLM that combines quantization-aware training and tuning techniques, low-rank adaptation (LoRA), and additional memory-efficient methods such as Flash Attention. The proposed model offers a substantial reduction in memory footprint and energy consumption while retaining competitive accuracy. Experimental evaluations on benchmark datasets validate the effectiveness of this approach, demonstrating its applicability to edge computing and other resource-limited deployments.
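The headline figure comes from ternary weights: log2(3) ≈ 1.585 bits of information per weight. A minimal per-tensor sketch of absmean ternary quantization in the style of BitNet b1.58 (a simplification, not the paper's full training pipeline) looks like this:

```python
def quantize_ternary(weights: list, eps: float = 1e-8):
    """Absmean ternary quantization: scale by the mean absolute
    weight, then round each scaled weight into {-1, 0, +1}."""
    scale = sum(abs(w) for w in weights) / len(weights) + eps
    q = [max(-1, min(1, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q: list, scale: float) -> list:
    """Approximate reconstruction of the original weights."""
    return [v * scale for v in q]

w = [0.9, -0.05, -1.2, 0.4]
q, s = quantize_ternary(w)
print(q)  # [1, 0, -1, 1]
```

Because every stored weight is -1, 0, or +1, matrix multiplication reduces to additions and subtractions; that is where the memory and energy savings come from, with quantization-aware training recovering most of the lost accuracy.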
Autochef AI: Multi-Modal Attention For Visual Ingredient Recognition And Recipe Generation From Food Images
Authors: Bhaskara B, Vinith M, Kumarswamy S
Abstract: Understanding food from images poses a significant challenge in the area of recipe search, with impactful applications in smart kitchens, dietary monitoring, and automated cooking assistance. Traditional approaches typically handle ingredient recognition and instruction generation as separate tasks, often resulting in incoherent or disjointed outputs. Here, we introduce Autochef AI, a multi-modal attention tool that seamlessly joins visual and textual information to accurately identify ingredients and generate step-by-step cooking instructions from food images. By incorporating attention mechanisms across both image and text modalities, our model captures fine-grained features essential for coherent and contextually grounded recipe generation. Experimental results demonstrate that our approach significantly improves both ingredient prediction accuracy and instruction quality across a wide variety of recipes and cuisines.
DOI: https://doi.org/10.5281/zenodo.16522342
Deep Neural Networks: Architecture, Applications, and Challenges
Authors: Mayank Shakkerwal
Abstract: Deep Neural Networks (DNNs) have revolutionized the field of artificial intelligence by enabling machines to learn from large amounts of data with human-level accuracy in tasks such as image recognition, Natural Language Processing (NLP), and game playing. This paper presents a comprehensive overview of the structure and function of DNNs, major advances in their development, popular architectures, real-world applications, and the major challenges that hinder their widespread adoption. We also highlight future directions in DNN research, including interpretability, efficiency, and ethical implications.
The Effectiveness of Artificial Intelligence Methods in Software Testing: An In-Depth Review
Authors: Dr. Kumarswamy S, Darshith L, Aditya B N, Mohammed Waseem, Nagraj, Nitin Reddy
Abstract: This review explores the effectiveness of Artificial Intelligence (AI) methods in software testing, addressing the high costs and challenges of traditional approaches. It examines key AI and Machine Learning (ML) techniques, their applications in test suite optimization, test case generation, and test case prioritization, highlighting quantitative improvements. The paper also discusses current challenges like data availability and model bias, and outlines future research directions for more adaptable and scalable AI solutions in software engineering.
Assessment Of Pollution Levels Using Biomarkers In Callinectes Sapidus From Estuaries In Rivers State, Nigeria
Authors: Doris Ugochi Obinna, Dike Henry Ogbuagu, Enos I. Emereibeole, Chris Chibuzor Ejiogu
Abstract: The increasing anthropogenic pollution in estuarine ecosystems poses a significant threat to aquatic life and ecosystem health. This study aims to assess the pollution levels in selected estuaries of Rivers State, Nigeria, using biomarkers in Callinectes sapidus (blue crab) as an indicator of environmental contamination. In situ measurements of some water quality variables were made at the sampling locations, and 48 female crabs (weight 149.20 ± 0.02 g) were harvested for the estimation of biomarker levels. Mean concentrations of Total Petroleum Hydrocarbons (TPHs), Polycyclic Aromatic Hydrocarbons (PAHs), Zn and Cr (Sig. values=0.000 each), and Cd, Pb, and Fe (Sig. t-values=0.003, 0.019 & 0.009 respectively) were significantly higher at the impacted than at the reference locations, while those of Monocyclic Aromatic Hydrocarbons (MAHs) and Fe (Sig. t-values=0.032 & 0.014 respectively) differed seasonally at p<0.05. Though there was no significant difference in the accumulation of the heavy metals and hydrocarbons across tissues of the organism, numerical accumulations of Zn (5.73±2.60 µg/g) and TPHs (1.84±1.08 µg/g) were highest in the digestive tissue among those sampled. Mean levels of Lactate Dehydrogenase (LDH), Alanine Aminotransferase (ALT), Aspartate Aminotransferase (AST), Alkaline Phosphatase (ALP) and Malondialdehyde (MDA) (sig=0.000 each) at the OSD locations, and that of total proteins (Sig. t-value=0.030) in the rainy season, were all markedly higher in the organism (p<0.05). Elevated MAHs appeared to induce the production of less ALT (r=-0.584) and AST (r=-0.519), Cr induced the production of less AST (r=-0.513) (p<0.05), while MAHs induced the production of less MDA (r=-0.634) (p<0.01). Lead and PAHs recorded very high pollution indices (240,000 & 790,000) in sediments, while Zn and TPHs recorded high toxicity quotients of 1.59 and 2.83 in the organism.
Allochthonous input of pollutants from petroleum sources into the creek caused biological disruptions, including tissue bioaccumulation and other biochemical disruptions in proteins and enzyme activities of C. sapidus, and these disruptions could rightly infer pollution. Treatment of oily effluents before discharge into the creek is recommended.
DOI: http://doi.org/
Blockchain System
Authors: Yashaswini.P, Yathiraj M N
Abstract: This paper provides a comprehensive overview of blockchain technology and explores its application in developing a robust, transparent, and tamper-resistant system to combat the growing issue of counterfeit products in the pharmaceutical industry. Over the past decade, pharmaceutical companies around the world have been grappling with significant challenges in monitoring and tracking their products across the supply chain. These vulnerabilities have created opportunities for counterfeiters to infiltrate the market with fake or substandard medicines, posing serious risks to public health and causing substantial economic losses to legitimate manufacturers. Counterfeit drugs represent a critical global challenge, undermining the integrity of healthcare systems and endangering the lives of millions. These illegitimate products often contain incorrect dosages, harmful ingredients, or no active pharmaceutical ingredients at all, leading to ineffective treatment, prolonged illness, and in some cases, fatal consequences. According to industry statistics, counterfeit drugs are responsible for an estimated $200 billion in annual losses to pharmaceutical companies in the United States alone. Moreover, a World Health Organization (WHO) survey report reveals that in many underdeveloped countries, approximately one out of every ten medicines consumed by patients is counterfeit or of low quality—highlighting the urgent need for a reliable and tamper-proof solution. In response to this pressing issue, our research proposes and implements a blockchain-based drug supply chain management system that leverages the core features of blockchain technology—immutability, decentralization, transparency, and traceability. In conclusion, this research highlights the transformative potential of blockchain technology in securing pharmaceutical supply chains.
DOI:
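The immutability and traceability properties this abstract relies on can be illustrated with a minimal hash-chained ledger. This is an illustrative sketch only, not the authors' implementation; the `append_block` and `chain_valid` helpers and the batch records are hypothetical.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Deterministically hash the block's contents, excluding its own hash field.
    payload = json.dumps({k: v for k, v in block.items() if k != "hash"}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append_block(chain: list, data: dict) -> None:
    # Each new block commits to the previous block's hash, so editing any
    # earlier block breaks every later link in the chain.
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "prev_hash": prev, "data": data}
    block["hash"] = block_hash(block)
    chain.append(block)

def chain_valid(chain: list) -> bool:
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain: list = []
append_block(chain, {"batch": "AMOX-2024-001", "stage": "manufactured"})
append_block(chain, {"batch": "AMOX-2024-001", "stage": "distributed"})
assert chain_valid(chain)

# Tampering with an earlier supply-chain record is detected immediately.
chain[0]["data"]["stage"] = "counterfeit"
assert not chain_valid(chain)
```

In a real deployment the ledger would be replicated across supply-chain participants, which is what makes the tamper-evidence decentralized rather than dependent on a single trusted database.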
The Interplay Between Digital Literacy And E-Pedagogy In Education
Authors: Dr. Debasis Ghosh
Abstract: In the evolving landscape of education, digital literacy and e-pedagogy have emerged as critical competencies for both educators and learners. Digital literacy refers to the ability to effectively and critically navigate, evaluate, and create information using a range of digital technologies. E-pedagogy, on the other hand, encompasses the theories and practices of teaching and learning through digital means, emphasizing interactive, student-centered, and technology-integrated approaches. This paper explores the interplay between digital literacy and e-pedagogy, highlighting their role in enhancing educational access, engagement, and outcomes. It examines how educators can develop digital competencies to design inclusive, flexible, and effective digital learning environments. Furthermore, the study investigates challenges such as digital divide, infrastructure limitations, and the need for ongoing professional development. Ultimately, the paper underscores the importance of institutional support, policy initiatives, and pedagogical innovation in fostering a digitally literate and pedagogically adept teaching community capable of meeting 21st-century educational demands.
DOI: https://doi.org/10.5281/zenodo.16257149
Advances In Application Of InP1-xAsx Semiconductor Alloy For Quantum Computing, Quantum Dot Technology, Quantum Photonics, And Spin-based Qubits
Authors: Dr. Alla Srivani, Professor
Abstract: The InP₁₋ₓAsₓ (indium phosphide-arsenide) ternary semiconductor alloy plays a significant role in quantum computing and optoelectronics, especially in the context of quantum communication, qubit systems, and quantum dot structures. An essential issue in developing semiconductor devices for photovoltaics and thermoelectrics is designing materials with appropriate band gaps and the proper positioning of dopant levels relative to the bands. Ternary semiconductor alloys provide a natural means of tuning the magnitude of the forbidden gap for a wide range of semiconductor device applications. The need for materials operating in the long-wavelength range for infrared detectors has led to the development of III-V ternary alloys such as InP₁₋ₓAsₓ. This alloy is particularly important because the concentration x of a constituent significantly changes calculated physical properties such as the band energy gap. These ternary compounds can be derived from the binary compounds InAs and InP by replacing one half of the atoms in one sublattice with lower-valence atoms and the other half with higher-valence atoms, while maintaining the average number of valence electrons per atom. The subscript x denotes the alloy content or concentration, i.e., the proportion of the material replaced by the alloying element. This paper presents band energy gap values for the InP₁₋ₓAsₓ III-V ternary semiconductor. Our results agree well with the available data in the literature.
DOI: https://doi.org/10.5281/zenodo.16258617
Structural Performance Evaluation of a Tall Building with Bracings and Base Isolation Using ETABS Software: A Review
Authors: Vivek Choudhary, Rahul Kumar Satbhaiya
Abstract: The growing demand for resilient high-rise structures in seismic-prone regions has led to the widespread adoption of seismic isolation and bracing systems. This review paper presents a comprehensive evaluation of the structural performance of tall buildings equipped with bracing and base isolation techniques, focusing on their seismic response. Emphasis is placed on research conducted using ETABS software, which provides advanced modeling and analysis tools for evaluating high-rise buildings under seismic loads. Key parameters such as story drift, base shear, lateral displacement, and floor acceleration are examined in the context of various isolation and bracing configurations. The review integrates findings from multiple studies, highlighting the effectiveness of lead-rubber bearings, friction pendulum systems, and bracing systems (X, V, and Z-bracing) in enhancing the lateral stability and energy dissipation capacity of structures. Comparative analyses demonstrate that the combination of base isolation and bracing significantly improves performance compared to conventional fixed-base models. Moreover, the role of soil-structure interaction and building geometry is also discussed to understand their influence on the overall response. This paper concludes that selecting an appropriate seismic control system based on building height, seismic zone, and soil type is critical for optimizing performance. The findings support the continued use of ETABS as a powerful tool for analyzing and designing seismically resistant tall buildings. This review aims to guide engineers, researchers, and designers in selecting efficient seismic mitigation strategies for modern structural systems.
Structural Performance Evaluation Of A Tall Building With Bracings And Base Isolation Using ETABS Software
Authors: Vivek Choudhary, Rahul Kumar Satbhaiya
Abstract: The safety of people inside a building depends on its ability to withstand seismic waves and survive an earthquake with minimal damage and repairs, and without collapsing. Various systems are used to absorb seismic energy, including dampers, seismic isolation devices, earthquake-resistant walls, and underground water tanks; the effectiveness of these systems depends on their type and location. In this study, the seismic analysis of a G+16-story residential building is carried out based on the dynamics of story shear and overturning moment. The ground-motion data are taken from the PEER database. Based on the maximum story shear and maximum overturning moment, the performance of laterally braced and seismically isolated structures is compared with that of conventional moment-frame structures. By placing these elements at the corners of the building in different models, an efficient and adequate model is obtained.
A Study on Real Time Monitoring of Carbon Emissions Using Building Information Modelling with Ai
Authors: Mr. Ankit Sethi, Abhishek
Abstract: The construction industry contributes approximately 39% of global CO₂ emissions, with embodied carbon—emissions from material extraction, manufacturing, and transportation—accounting for 11%. Traditional life cycle assessment (LCA) tools for estimating embodied carbon are often disconnected from Building Information Modeling (BIM) environments and require manual input, limiting their usability during early design stages. This study presents an AI-integrated BIM framework that enables real-time embodied carbon estimation directly within Autodesk Revit. Using Python-based machine learning models—Random Forest, Gradient Boosting, and Support Vector Regression—trained on structural data extracted via Dynamo, the system predicts carbon values and visualizes results through heatmaps in the Revit model. The Random Forest model achieved the highest accuracy (MAE: 5.4 kg CO₂, R²: 0.93) and outperformed traditional tools like One Click LCA in both speed and precision. The framework enhances decision-making during the design phase and demonstrates strong potential for scalable, automated, and sustainable design practices in the built environment.
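The regression-and-metrics workflow the abstract describes can be sketched as below. This is a minimal stand-in, not the paper's pipeline: the feature names are hypothetical and the embodied-carbon target is synthetic, since the real data comes from Revit via Dynamo.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Hypothetical structural features: element volume (m^3), steel ratio,
# transport distance (km) — stand-ins for attributes extracted from a BIM model.
X = rng.uniform([1, 0.005, 10], [50, 0.05, 500], size=(n, 3))
# Synthetic embodied-carbon target (kg CO2) with noise, replacing real LCA data.
y = 300 * X[:, 0] + 8000 * X[:, 1] * X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 20, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
print(f"MAE: {mean_absolute_error(y_te, pred):.1f} kg CO2, R2: {r2_score(y_te, pred):.3f}")
```

Per-element predictions like these are what would be mapped back onto Revit geometry as a heatmap; the MAE/R² figures reported in the abstract come from the authors' real dataset, not from this sketch.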
Integrating Customer Relationship Management (CRM) With Digital Marketing: A Computer Science Perspective
Authors: Ms. Neha Bhat, Mr. Amit Punia
Abstract: The convergence of Customer Relationship Management (CRM) systems with digital marketing techniques has significantly transformed how organizations interact with their customers. In today’s digital economy, data-driven decision-making is essential. By integrating CRM with technologies such as Artificial Intelligence (AI), Machine Learning (ML), and Cloud Computing, businesses can enhance personalization, accurately segment customers, and foster greater loyalty. This paper adopts a computer science-centric approach to examine the architecture, intelligent algorithms, and system integration techniques that enable CRM to serve as an effective digital marketing tool. Real-world case studies from Amazon, Salesforce, and Zoho demonstrate how CRM systems contribute to operational efficiency, improved conversion rates, and long-term customer engagement. A technical framework for AI-enhanced CRM in omnichannel environments is also proposed.
DOI:
Study of Dissimilar Welding Microstructure of Duplex Stainless Steel SFA 2205 with High Strength Low Alloy Steel A387-GR.11 Welded by TIG Process
Authors: Seyyed Moslem Mousavi Khademi, Ali Shafiee, Abbas Najafizadeh
Abstract: In this paper, the dissimilar welding microstructure of duplex stainless steel SFA 2205 with high-strength low-alloy steel A387 Gr.11 was studied. The microstructure investigations indicated that the weld has a two-phase structure comprising dendritic and interdendritic areas. A high-hardness transition area was detected at the interface of the A387 low-alloy steel and the ER 309L filler metal. An unmixed zone was observable at the melting boundary of SFA 2205 duplex steel with both the austenitic and duplex filler metals. The results showed that for joining duplex stainless steel SFA 2205 to high-strength low-alloy steel A387 Gr.11, the ER2209 filler metal is more appropriate, as it forms a microstructure with more suitable properties.
Review on NOMA-Based Communication in 5G Scheme on Nonlinear Real Signal SVM OFDM System
Authors: Hemant Iklodiya, Madhvi singh Bhanwar
Abstract: Due to massive connectivity and the increasing demands of various services and data-hungry applications, a full-scale implementation of fifth-generation (5G) wireless systems requires more effective radio access techniques. In this regard, non-orthogonal multiple access (NOMA) has recently gained ever-growing attention from both academia and industry. Compared to orthogonal multiple access (OMA) techniques, NOMA is superior in terms of spectral efficiency and is thus appropriate for 5G and beyond. In this article, we provide an overview of NOMA principles and applications.
DOI:
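The spectral-efficiency claim for power-domain NOMA can be illustrated with textbook two-user rate formulas: the far (weak-channel) user receives the larger power share, and the near user applies successive interference cancellation (SIC) before decoding its own signal. The channel gains and power split below are illustrative values, not from the paper, and perfect SIC is assumed.

```python
import math

def noma_rates(p_tot, a_far, g_near, g_far, noise=1.0):
    # Two-user power-domain NOMA: the far user decodes treating the near
    # user's signal as interference; the near user cancels the far user's
    # signal via SIC before decoding (assumed perfect here).
    a_near = 1.0 - a_far
    r_far = math.log2(1 + a_far * p_tot * g_far / (a_near * p_tot * g_far + noise))
    r_near = math.log2(1 + a_near * p_tot * g_near / noise)
    return r_near, r_far

def oma_rates(p_tot, g_near, g_far, noise=1.0):
    # Orthogonal baseline: each user gets half of the time/bandwidth resource.
    r_near = 0.5 * math.log2(1 + p_tot * g_near / noise)
    r_far = 0.5 * math.log2(1 + p_tot * g_far / noise)
    return r_near, r_far

# Illustrative setup: near user's channel gain 10x the far user's,
# 80% of the power allocated to the far user.
rn, rf = noma_rates(p_tot=10.0, a_far=0.8, g_near=1.0, g_far=0.1)
on, of_ = oma_rates(p_tot=10.0, g_near=1.0, g_far=0.1)
print(f"NOMA sum rate: {rn + rf:.2f} b/s/Hz vs OMA: {on + of_:.2f} b/s/Hz")
```

With asymmetric channels the NOMA sum rate exceeds the OMA sum rate, which is the intuition behind the abstract's claim that NOMA improves spectral efficiency over orthogonal schemes.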
Developing Explainable Machine Learning Models For Decision Transparency In Healthcare And Finance
Authors: Ms. Roshni Shailesh Gupta
Abstract: Machine learning (ML) models are being widely adopted in high-stakes sectors such as healthcare and finance due to their ability to uncover patterns in data and produce predictive insights. However, many of these models function as opaque “black boxes,” making it difficult for end-users and stakeholders to understand how specific decisions are derived. This lack of interpretability can erode trust, hinder adoption, and raise ethical and regulatory concerns, particularly when decisions affect individuals’ health or financial well-being. Explainable Machine Learning (XML) aims to mitigate these issues by introducing methods that make ML models more transparent and understandable. This paper presents a comprehensive examination of XML techniques, evaluates their implementation across healthcare and finance, and proposes a methodological framework to enhance both accuracy and interpretability in ML systems. The findings highlight that XML is not merely a technical enhancement but a critical enabler of trustworthy, fair, and responsible artificial intelligence (AI).
Agentic AI Systems for Software Development Automation
Authors: Professor Nikita Bante, Professor Uday Mahure, Professor Prajakta Helonde, Professor Radha Yete, Professor Aachal Aakre
Abstract: The advent of Agentic AI systems—AI entities that possess autonomy, contextual awareness, and adaptive learning capabilities—has revolutionized the landscape of software development. Unlike traditional rule-based automation tools, agentic AI can perform high-level cognitive functions, including code generation, optimization, debugging, and collaborative task execution without constant human oversight. This paper explores the role of agentic AI in automating various phases of the software development lifecycle (SDLC), from requirements gathering to deployment and maintenance. The research highlights the growing integration of Large Language Models (LLMs), multi-agent systems, and self-improving codebases. It discusses how these intelligent agents enhance developer productivity, reduce time-to-market, and minimize manual coding errors. Through a blend of empirical evidence, recent technological advancements, and case studies, the study showcases the operational and strategic implications of adopting agentic AI. It further identifies potential challenges, such as security risks, interpretability, over-reliance, and ethical dilemmas. The goal is to contribute to a better understanding of how agentic systems are reshaping software engineering practices and to offer practical recommendations for integrating these tools in development workflows responsibly and efficiently.
Structural Analysis of Power Take-Off Shafts under Operational Loads Using Finite Element Method
Authors: Research Scholar Vaibhav Gajbhiye, Assistant Professor Mr.Praveen Patidar, Assistant Professor Mr.Saurabh Verma
Abstract: This study investigates the structural behavior of a Power Take-Off (PTO) shaft used in agricultural applications under operational loading conditions using Finite Element Analysis in ANSYS. A detailed CAD model of the PTO shaft, including critical geometric features such as splines and fillet transitions, was developed to represent real-world configurations accurately. Structural analysis was performed by applying combined torsional and bending loads to simulate typical field conditions experienced during power transmission to implements. The analysis evaluated equivalent (von Mises) stress distribution, total deformation, and equivalent elastic strain to identify critical stress concentration regions and assess the shaft’s structural integrity. Results indicated that maximum stresses and strains were concentrated near the spline and fillet regions, consistent with areas of geometric discontinuity, while deformation profiles reflected cantilever behavior with the highest displacement at the free end. The study confirmed that the stress, strain, and deformation levels in the PTO shaft made of SAE 4140 remained within permissible limits under applied loads, ensuring operational safety and reliability. These findings provide valuable insights for design validation and optimization of PTO shafts to enhance durability and performance in agricultural machinery.
Geometric Dimensioning and Tolerance (GD&T): Enhancing Precision and Clarity in Engineering
Authors: Ms. Amisha Malviya, Mr. Rahul Khobragade, Mr. Rahul Ghotkar
Abstract: Geometric Dimensioning and Tolerance (GD&T) is a standardized symbolic language used on engineering drawings and models to describe the allowable variation in part geometry. Unlike traditional linear dimensioning, GD&T defines permissible limits of imperfection using feature-based tolerances that relate to functional performance. The primary aim is to ensure that components fit together and function reliably under defined operating conditions, regardless of minor variations during manufacturing. This paper presents a comprehensive analysis of GD&T from its historical evolution to practical industry application, its advantages, challenges, and role in modern digital manufacturing. The study also explores how GD&T contributes to precision engineering, reduces rework, enhances productivity, and ensures compliance with global quality standards. Extensive references from international journals, technical standards, and industrial case studies are provided to support the theoretical and practical insights.
DOI: https://doi.org/10.5281/zenodo.16407012
A Comprehensive Observation of Video Quality Enhancement Using Machine Learning Algorithm
Authors: Kalpana Chaurasia, Sachin Chourasia
Abstract: Studies show extensive advanced research on various data types such as image, speech, and text using deep learning techniques, and research on video processing is now also an emerging field of computer vision. Several surveys exist on video processing using deep learning, targeting specific functionality such as anomaly detection, crowd analysis, and activity monitoring; however, a combined study is still unexplored. This paper presents a Systematic Literature Review (SLR) on video processing using deep learning to investigate the applications, functionalities, techniques, datasets, issues, and challenges by formulating relevant research questions (RQs). One option to enhance video quality is to change the camera lens, which is costly, so an alternative is required. Hence, we also propose that a convolutional neural network (CNN) may prioritize distinct aspects in a video and differentiate between them. A clear review of the literature on several CNN-based video enhancement techniques is carried out.
DOI: https://doi.org/10.5281/zenodo.16406507
Folding Algorithms Of Life: Mathematical Insights Into Protein Misfolding, Disorders, And Therapies
Authors: Ms. Meenal Maan, Er. Rajdeep Saharawat, Muskan, Dr. Vinit Kumar Sharma
Abstract: Proteins must fold into specific three-dimensional structures to function correctly. Errors in protein folding—misfolding—can lead to aggregation and are associated with several degenerative diseases, including Alzheimer’s, Parkinson’s, and Huntington’s. This review explores the molecular mechanisms of protein folding and misfolding, the cellular quality control systems managing these processes, and the pathogenesis of misfolding-related disorders. We also discuss therapeutic approaches aimed at correcting misfolding or enhancing proteostasis.
DOI: https://doi.org/10.5281/zenodo.16407753
A Comparative Study on Application of Various Methods in Game Theory
Authors: Kushagra Sharma, Dr Vinit Kumar Sharma, Anjali Goyal
Abstract: In this paper, we discuss the application of various methods ([9],[10]) for solving a game problem, such as the dominance method, graphical method, algebraic method, and simplex method. Each method has its limitations and benefits, which depend on the nature of the problem. Students may learn about the uses of these various methods by studying this paper.
Prediction Of Compressive And Splitting Tensile Strengths In Steel Fiber-Reinforced Recycled Aggregate Concrete Using Machine Learning And PSO Optimization
Authors: Yassine Dahbi, Hamza Naciri, Hamza Zaouri, Ouahib Alaoui
Abstract: This study examines the use of GradientBoostingRegressor, StackingRegressor, and Gradient Boosting Regression with HistGradientBoosting in developing models that predict the compressive strength (fcu) and splitting tensile strength (fsp) of steel fiber-reinforced recycled aggregate concrete (SFR-RAC). The dataset comprises 465 compressive-strength and 339 splitting-tensile-strength records of concrete mixes with varied ratios. Training and model testing were performed using an 80/20 split, with PSO for hyperparameter optimization. The performance of the models was measured with four statistical metrics: coefficient of determination (R²), mean absolute error (MAE), root mean squared error (RMSE), and mean absolute percentage error (MAPE). Of the models, Gradient Boosting Regression with HistGradientBoosting performed best in terms of prediction, with StackingRegressor taking second rank. SHapley Additive exPlanations (SHAP) and feature importance were employed to determine the influence of input parameters on model predictions. From the results obtained, it was evident that the water content, cement content, and fiber ratio considerably influence the strength of SFR-RAC. The models give good insights into SFR-RAC mixture behavior, which is helpful in the production of environmentally friendly concrete with enhanced strength. Future research can expand the data and use other predictor variables to further support these models.
DOI: https://doi.org/10.5281/zenodo.16420152
Representation of Nature in Indian Advertising Logos
Authors: Assoc. Prof. Swati Mehta, Dr. Ujjvala M. Tiwari
Abstract: For hundreds of years, creators and designers have looked to nature for inspiration. Sustainability and environmentalism have led to the widespread use of natural textures, shapes, and colours in many areas of design. More recently, graphic designers have been unable to ignore the beauty of mother nature as a source of inspiration, just as in painting, sculpture, architecture, and textiles. According to a number of studies, logos that represent real-world objects—such as plants, animals, or locations—require less processing effort than abstract ones because they are easier to recognize. Because they appeal to a particular target demographic and offer a personal touch, animals are a popular symbol for logos. For instance, the image of the lion, ruler of the jungle, stands for power, strength, bravery, and justice. Jewellery logos draw on the beauty and grace of a swan, while logos that use lions may symbolize a brand's strength or authority within its industry. Certain companies' logos use plants, trees, and flowers to symbolize life, growth, creativity, freedom, harmony, prosperity, value, and tranquillity. Unilever's emblem features 25 natural symbols; a lion and palm tree represent the RBI; a galloping horse, TVS Motor Company; a soaring swan with the Konark Chakra, Air India; and a banyan tree, Dabur India Limited. The current study focuses on how different forms of nature are portrayed in the logos of Indian advertising firms.
DOI: http://doi.org/10.5281/zenodo.16408435
Synthesis and Characterization of Polymer-Metal Chelates Derived from Oxalic Acid and Thiosemicarbazide
Authors: Professor Fiza Pathan, Professor Hirkanya Bhole, Professor Nageshwari Sarade, Professor Sushma Borewar, Professor Bhavesh Thakre
Abstract: Polymeric materials have become increasingly significant due to their wide-ranging applications and adaptability to modern societal needs. These materials exhibit remarkable properties, including thermal stability, chemical resistance, conductivity, and ion-exchange capacity. The development of polymeric ligands and coordination polymers, particularly those containing donor atoms such as N, S, and O, has expanded their utility in various fields, including catalysis, electronics, surface coatings, and biomedical applications. Chelate polymers, formed by coordinating metal ions with organic ligands, exhibit both organic and inorganic characteristics, offering desirable magnetic, thermal, and electrical properties. Despite challenges such as poor solubility and plasticity, chelate polymers are utilized in the aerospace, automotive, and semiconductor industries due to their thermal resilience and functional diversity. Recent research has focused on designing low-band-gap conducting polymers and synthesizing various metal complexes with Schiff bases, hydroxamic acids, and thiosemicarbazones, leading to advances in coordination chemistry and material science. These developments underline the transformative role of polymers in science, industry, and everyday life.
Phishing Website Detection Using Machine Learning
Authors: Udith A, Harsha H S, Jayanth B R, Prathibhavani P M
Abstract: In today’s fast-changing digital environment, phishing attacks are a major cybersecurity concern. These attacks use deceptive messages to trick users into revealing sensitive information or installing harmful software. Historically, such attacks have involved widespread spam campaigns that target many users with malicious URLs or files designed to bypass standard security measures. To address the increasing sophistication of these threats, this research introduces an intelligent, real-time framework for detecting phishing URLs using machine learning. A gradient boosting classifier was chosen to systematically examine and distinguish phishing URLs from legitimate ones. The approach relies on a broad suite of lexical, structural, and host-based features. The classifier outperforms traditional methods, including support vector machines, decision trees, random forests, and neural networks, demonstrating both higher accuracy and lower false-positive rates. These results validate the system’s capacity for timely and effective phishing detection. The work underscores the promise of sophisticated machine learning methods for enhancing digital trust and reinforcing cyber defense architectures.
DOI: http://doi.org/10.5281/zenodo.16408435
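The lexical and host-based features the abstract refers to can be illustrated with a small extractor. The feature names below are hypothetical examples of common phishing signals, not the paper's actual feature set:

```python
from urllib.parse import urlparse

def url_features(url):
    """Extract a few illustrative lexical/host-based phishing signals."""
    parsed = urlparse(url)
    host = parsed.netloc
    return {
        "url_length": len(url),
        "num_dots": host.count("."),            # many subdomains is suspicious
        "has_at_symbol": "@" in url,            # '@' can hide the true host
        "has_ip_host": host.replace(".", "").isdigit(),  # raw IP instead of domain
        "uses_https": parsed.scheme == "https",
        "num_hyphens": host.count("-"),
    }

print(url_features("http://192.168.0.1/login"))
print(url_features("https://example.com/account"))
```

In a full pipeline, vectors like these would be fed to the gradient boosting classifier alongside structural and host-reputation features.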
AI-Powered MediKit: An AI System with Miscellaneous Medical Applications
Authors: Praveen, Bhaskarm
Abstract: Healthcare is the foundation of a well-functioning society, yet billions of people, especially in rural, remote, or economically disadvantaged regions, continue to face limited or delayed access to qualified medical support. In such contexts, Artificial Intelligence (AI) is not just a technological advancement but a transformative tool for democratizing healthcare services. The AI-Powered MediKit is designed as a comprehensive, intelligent, and accessible digital healthcare assistant that empowers users with preliminary diagnostics, wellness support, and actionable medical guidance from the comfort of their homes. Rather than being a single-purpose tool, MediKit is a modular, multi-functional platform that integrates advanced AI technologies such as image processing, speech and audio analysis, natural language understanding, and knowledge-based recommendation systems. Each module is carefully developed for scalability, usability, and inclusivity, ensuring that the system is effective across diverse user groups. MediKit bridges the healthcare gap by making intelligent, early-stage diagnostics available on everyday devices, thus contributing to more equitable and proactive health management.
Humour vs. Knowledge: The Impact of Memes on Public Opinion in the Context of Truth
Authors: Arina Das
Abstract: In today’s digital ecosystem, memes have become influential tools of communication, especially in the realm of celebrity culture. While their humour and virality attract widespread engagement, this study investigates how such content can distort, simplify, or even erase factual understanding. Focusing on a three-month period (October to December 2024), the research examines high-engagement celebrity memes on Instagram and Facebook using a Critical Discourse Analysis (CDA) framework. By analyzing both the visual and textual elements of the memes and audience responses in comment sections, the study uncovers how humour frequently overshadows truth, reducing complex public narratives to easily consumable jokes. Findings reveal recurring patterns of objectification, ageism, slut-shaming, and misinformation, often masked as harmless entertainment. Audience engagement further reinforces these distorted portrayals, indicating a cultural shift where emotional appeal takes precedence over informed discourse. The research also highlights how platform algorithms favour humorous, sensational content, amplifying its reach regardless of its accuracy or ethical implications. While humour can be a powerful medium for critique and resistance, its overuse, particularly when misapplied, poses significant risks to public understanding. This study emphasizes the urgent need for critical media literacy and ethical content creation in order to preserve the integrity of information in an era where humour increasingly dominates digital storytelling.
Neuroarchitecture in Incubation Centers: Designing Spaces That Think
Authors: Danish Khan, Professor Ar. Dilip Jade, Professor Ar. Gulfam Shaikh
Abstract: In the ever-evolving landscape of innovation, architecture is no longer just a backdrop; it is an active participant in shaping human thought, behavior, and performance. Neuroarchitecture—a discipline at the intersection of neuroscience and architectural design—explores how spatial environments influence brain function and cognition. As startup culture thrives and the demand for incubation centers increases, understanding the neurological impact of spatial design becomes vital. This research investigates the principles of neuroarchitecture and their application in designing incubation centers that foster creativity, productivity, collaboration, and psychological well-being. The paper highlights the science behind neuroarchitectural strategies and presents design guidelines and case studies that illustrate how spaces can be programmed to “think” with their users.
TruthLens: A System For Stock Market News Analysis And Fake News Detection Using BERT
Authors: Shreyash Akole, Mansi Rakhonde, Saish Desai
Abstract: The rapid growth of online news and digital media has significantly impacted financial markets, where even a single headline can influence investor behavior. With this increasing dependence on news, ensuring the authenticity and sentiment of financial information has become more important than ever. This research presents a dual-purpose system that combines stock market news sentiment analysis with fake news detection, using Natural Language Processing (NLP) techniques and Machine Learning (ML) algorithms to analyze financial news, detect fake information, and suggest investment actions such as Buy, Sell, or Hold. The system uses Bidirectional Encoder Representations from Transformers (BERT) and Long Short-Term Memory (LSTM) models for sentiment and authenticity analysis, ensuring reliability and accuracy. Most existing systems perform either sentiment analysis or fake news detection; by combining both in a single platform, this framework is more reliable and empowers traders, investors, and institutions to make smarter, safer financial decisions. The system reached 93% accuracy in sentiment analysis and 96% in fake news detection.
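One way the two model outputs could be fused into a Buy/Sell/Hold suggestion is a simple decision rule. The thresholds and function names below are illustrative assumptions, not the paper's actual logic:

```python
def recommend(sentiment_score, fake_prob, fake_threshold=0.5):
    """Fuse sentiment (-1..1) and fake-news probability (0..1) into an action.

    Hypothetical rule: distrust the headline entirely if it is likely fake,
    otherwise act on sufficiently strong sentiment.
    """
    if fake_prob >= fake_threshold:
        return "Hold"          # unreliable news: take no action on it
    if sentiment_score > 0.3:
        return "Buy"
    if sentiment_score < -0.3:
        return "Sell"
    return "Hold"              # authentic but weak signal

print(recommend(0.8, 0.1))     # positive, authentic news  -> Buy
print(recommend(-0.6, 0.2))    # negative, authentic news  -> Sell
print(recommend(0.9, 0.95))    # positive but likely fake  -> Hold
```

Gating on authenticity first is the key design choice: a strongly positive but fabricated headline should never trigger a trade.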
Adaptive Design Of Overwater Villas For Rising Sea Levels
Authors: Sharyu Chinchole
Abstract: As climate change continues to reshape coastlines and alter ecosystems, architecture stands at a crossroads between crisis and creativity. This research explores the architectural response to rising sea levels through the lens of adaptive overwater villas. Traditionally associated with luxury, overwater villas are reimagined here as resilient, climate-responsive habitats that float, adapt, and endure. The study investigates floating systems, modular strategies, sustainable design tools, and precedents from across the globe, aiming to develop a contextual, feasible solution for future-ready living. The final proposition blends engineering and environmentalism into a design that is not only functional but poetic: an architecture that floats with the planet, not against it.
Green Building
Authors: Sakshi Gajanan Jayale, Jayant Ingole, Saiyam S. Chaturvedi, Dr. Sudhir V. Dhomane, Dr. P. V. Thorat
Abstract: In the wake of escalating climate change and environmental degradation, green building technologies have emerged as essential strategies in sustainable development. This seminar explores the relationship between built environments and the natural ecosystem, emphasizing the pivotal role of buildings in energy consumption, carbon emissions, and global warming. The study categorizes green technologies that enhance energy efficiency, occupant comfort, and environmental performance through passive and active systems such as thermal mass utilization, natural ventilation, stack effect, daylighting, and high-performance glazing. It also underlines adaptive strategies to respond to climate projections using mitigation and resilience-based approaches. The report introduces a structured methodology for integrating these technologies into new and existing buildings, highlighting key principles like integrated design, responsive planning, performance benchmarking, and climate adaptation. By bridging theoretical frameworks with practical design applications, this research reinforces the potential of green building strategies to minimize ecological footprints while ensuring productivity, health, and comfort within the built environment.
Healing Through Nature: A Study Of Biophilic Elements In Rehabilitation Environments
Authors: Aliya Fulara
Abstract: Biophilic design is a powerful architectural and psychological approach that reconnects humans with nature. This study investigates the impact of biophilic elements in rehabilitation environments, particularly in centers aimed at treating various forms of addiction and psychological stress. Through the integration of natural materials, daylight, vegetation, water features, and views of nature, biophilic design can significantly improve patient recovery rates, reduce stress, and enhance emotional well-being. The paper explores how spatial design rooted in biophilic principles—such as natural ventilation, organic forms, and immersive landscapes—can promote healing. It also examines case studies and concludes with recommendations for designing restorative spaces that prioritize the human-nature connection. This research aims to guide architects, planners, and healthcare professionals in creating rehabilitation spaces that truly support holistic recovery. It explores global examples and proposes strategies to integrate these principles in future rehabilitation architecture. The results affirm the positive impact of nature on healing, mental restoration, and long-term patient engagement.
Predicting Food Wastage In Nepal Using Linear Regression: An Empirical Assessment Towards Achieving SDG 12.3
Authors: Dr. Saumendra Mohanty
Abstract: Food wastage contributes significantly to global environmental degradation and economic inefficiencies, particularly in developing countries. This study investigates the trend and forecast of per capita food wastage in Nepal using a simple Linear Regression model applied to UNEP and FAO datasets (2015–2021). The findings assess the nation’s trajectory against Sustainable Development Goal (SDG) 12.3, which aims to halve food waste by 2030. The analysis reveals that Nepal is not on track to meet this target unless intervention strategies are urgently adopted. We also discuss the potential of data-driven policymaking to guide national sustainability efforts.
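A simple linear regression of per capita wastage on year, extrapolated to 2030, is the core of the approach described above. A minimal closed-form (ordinary least squares) sketch; the numbers are illustrative placeholders, not the UNEP/FAO figures used in the study:

```python
def fit_line(xs, ys):
    """Ordinary least squares fit: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Illustrative per-capita wastage series (kg/capita), NOT the study's data.
years = [2015, 2016, 2017, 2018, 2019, 2020, 2021]
waste = [78.0, 79.1, 80.3, 81.0, 82.2, 83.5, 84.1]

m, b = fit_line(years, waste)
print(f"Trend: {m:+.2f} kg/capita per year")
print(f"Projected 2030 wastage: {m * 2030 + b:.1f} kg/capita")
```

With a positive slope, the 2030 projection lies above the baseline rather than halving it, which is exactly the kind of evidence the study uses to conclude Nepal is off track for SDG 12.3.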
Custom GPT System
Authors: Prathibhavani P M, Pragathi S
Abstract: We propose a Custom GPT System that allows domain-specific adaptation of a large language model through a flexible, modular pipeline. The system uses a Python Flask REST API as the backend server, facilitating a microservices-style deployment of core components. A high-performance Groq API endpoint executes inference on Meta’s LLaMA 3 model (available in 8B and 70B parameter sizes) to generate responses. Crucially, our pipeline integrates a Retrieval Augmented Generation (RAG) stage: incoming queries trigger semantic retrieval from a vector store of domain documents, and the returned context is combined with the user prompt to guide generation. Prompts and model settings are parameterized via human-readable YAML configuration files, enabling easy customization of system behaviour and personality. This architecture can be applied to sectors like healthcare, education, and customer support by supplying relevant documents and prompt templates. We describe the system architecture (see Fig. 1), implementation details, and an API-based deployment strategy. In evaluation, the Groq-accelerated LLaMA 3 achieves up to 18× throughput improvements versus a GPU baseline, and the RAG component markedly reduces hallucinations by grounding output in up-to-date knowledge.
DOI: http://doi.org/10.5281/zenodo.16444282
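The RAG stage described in the abstract (retrieve relevant context, then prepend it to the user prompt) can be sketched with a toy bag-of-words retriever standing in for the embedding-based vector store; the documents and query below are illustrative, not from the actual system:

```python
import math
import re
from collections import Counter

def vectorize(text):
    """Toy term-frequency vector; a real system would use embeddings."""
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Return the k documents most similar to the query."""
    qv = vectorize(query)
    return sorted(docs, key=lambda d: cosine(qv, vectorize(d)), reverse=True)[:k]

docs = [
    "Flask routes map URLs to Python view functions.",
    "YAML files configure the system prompt and model settings.",
    "LLaMA 3 is served through the Groq inference API.",
]

question = "How do I change the model settings?"
context = retrieve(question, docs)[0]
# The retrieved context is combined with the user prompt to guide generation.
prompt = f"Context: {context}\n\nQuestion: {question}"
print(prompt)
```

Swapping `vectorize`/`cosine` for an embedding model and a vector database gives the production form of the same pipeline; the prompt-assembly step is unchanged.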
Design Considerations for Schools for Children with Autism Spectrum Disorder
Authors: Vaidehi Rajkumar, Ar. Dilip Jade, Ar. Gulfam Shaikh
Abstract: Children with Autism Spectrum Disorder (ASD) face a wide range of sensory, behavioural, and social challenges that deeply influence how they experience school. Most traditional learning environments do not meet their specific needs, which can lead to anxiety, sensory overload, and difficulties with learning and socialization. This paper discusses practical design strategies that can make schools more inclusive and supportive for children with ASD. By focusing on sensory-friendly spaces, clear layouts, flexible areas, and safe opportunities for social interaction, we can create environments that help students feel more comfortable and engaged. Drawing from research and global case studies, this paper highlights the importance of thoughtful, adaptable, and human-centred design in supporting the development of autistic students.
DOI: http://doi.org/10.5281/zenodo.16444282
Structural Evaluation Of Tall Buildings Using Ferroconcrete And Steel-Timber Hybrid Systems Under Lateral Loads In ETABS: A Review
Authors: Pavan Patel, Rahul Kumar Satbhaiya
Abstract: This paper reviews the structural evaluation of tall buildings using ferroconcrete (reinforced concrete) and steel-timber hybrid systems under lateral loads, emphasizing seismic and wind forces. With the growing need for sustainable and high-performance building materials, steel-timber hybrids present a viable alternative to conventional RCC systems. Using insights from past research and simulation tools like ETABS, the study analyses lateral load performance, energy dissipation, and environmental benefits. The review highlights key developments, identifies research gaps, and suggests future directions for adopting hybrid systems in Indian structural design.
Structural Evaluation Of Tall Buildings Using Ferroconcrete And Steel-Timber Hybrid Systems Under Lateral Loads In ETABS
Authors: Pavan Patel, Rahul Kumar Satbhaiya
Abstract: In a densely populated and rapidly urbanizing country like India, the construction of multistorey buildings is essential to meet increasing housing and infrastructure demands. These structures are often subjected to lateral forces, which act horizontally and can significantly affect their performance. Such forces primarily include wind loads and seismic activity, both of which can cause structural deflection, instability, or even failure if not adequately addressed. To ensure buildings can withstand these lateral effects, advanced structural analysis tools like ETABS are employed. ETABS is a comprehensive software used for modeling, analysis, and design of buildings in compliance with national and international codes. It provides accurate simulations of how structures respond to various loading conditions, making it ideal for evaluating complex buildings. Hybrid structural systems, which combine different construction materials, are gaining popularity for enhancing structural performance. One common and effective hybrid combination is steel and concrete, which leverages the tensile strength of steel and the compressive strength of concrete. Steel is especially favoured for its high load-carrying capacity and fast construction capabilities, and when integrated with other materials such as timber or ferrocement, it helps overcome limitations inherent to individual materials. This study investigates a G+7 storey steel-timber hybrid building reinforced with ferroconcrete (wire mesh embedded in concrete). A three-dimensional static analysis is performed using ETABS to assess the building’s structural behavior. The study focuses on key parameters such as maximum nodal displacement, bending moment, shear force, axial force, and storey drift to evaluate the effectiveness of the hybrid system under lateral loading conditions
DOI: https://doi.org/10.5281/zenodo.16595886
Hybrid Cnn-Gru Model with Residual Connections for Multi-Class Fault Detection In Industrial Systems
Authors: S. Radha Krishnan (Assistant Professor), M. Aishwarya, Dr. R. Natarajan
Abstract: Fault detection in industrial systems is crucial for ensuring operational safety, minimizing downtime, and reducing maintenance costs. This work proposes a hybrid deep learning model combining Convolutional Neural Networks (CNN) and Gated Recurrent Units (GRU) to detect and classify machine faults from time-series data. The CNN layers extract spatial features, while GRU layers model temporal dependencies in the data. The architecture incorporates residual connections to enhance gradient flow and improve learning efficiency. The model is evaluated on multi-class fault detection datasets, achieving robust performance with high accuracy, precision, recall, and F1-score. Advanced metrics, including ROC-AUC, logarithmic loss, Cohen’s Kappa, and Matthews Correlation Coefficient, demonstrate the model’s reliability. Visualization of confusion matrices and detailed performance metrics validates its effectiveness in detecting anomalies and classifying fault types. This approach can be generalized for real-time monitoring systems in various industrial applications, ensuring predictive maintenance and operational excellence.
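A minimal PyTorch sketch of the CNN-GRU-with-residual idea described in the abstract; the layer sizes, kernel widths, and class count are assumptions for illustration, not the paper's actual architecture:

```python
import torch
import torch.nn as nn

class CNNGRUFault(nn.Module):
    """CNN front-end with a residual connection, GRU back-end, linear head."""

    def __init__(self, in_channels=1, n_classes=4, hidden=32):
        super().__init__()
        self.conv1 = nn.Conv1d(in_channels, hidden, kernel_size=5, padding=2)
        self.conv2 = nn.Conv1d(hidden, hidden, kernel_size=5, padding=2)
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):                    # x: (batch, channels, time)
        h = torch.relu(self.conv1(x))        # spatial feature extraction
        h = torch.relu(self.conv2(h)) + h    # residual connection aids gradient flow
        h = h.transpose(1, 2)                # (batch, time, features) for the GRU
        out, _ = self.gru(h)                 # temporal dependency modeling
        return self.fc(out[:, -1])           # classify from the last time step

model = CNNGRUFault()
logits = model(torch.randn(8, 1, 128))       # 8 windows of 128 sensor samples
print(logits.shape)
```

Training would proceed with a standard cross-entropy loss over the fault-class logits; the residual skip around `conv2` is the detail the abstract highlights for improved learning efficiency.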
Loading Analysis of Two Different Lightweight Steel Structures and Observations of Their Loading
Authors: Rajnandani Koshti, Anubhav Rai (Assistant Professor)
Abstract: Calibration and testing of lightweight structures are an important part of building design. The proposed work concerns the analysis of two different types of steel structures, with the main focus on reducing total weight and simplifying design. This research identifies the best lightweight structure using STAAD.Pro software. The main objective of this work is to design steel frames and compare both structures.
Assessment Of Combined Antibacterial Activity Of Tridax Procumbens And Lantana Camara Leaf Extracts
Authors: Maurya Kiran M, Gadekar Mayuri N, Davkhar Kalyani U, Kadbhane Ashwini S
Abstract: This study investigated the antibacterial activity of two medicinal plants, Tridax procumbens and Lantana camara; they are common weeds often found in cultivated areas and wastelands, and hold significant importance in traditional Indian medicine. These plants are rich in a diverse array of secondary metabolites, including tannins, flavonoids, alkaloids, saponins, phenols, steroids, anthocyanins, proteins, and carbohydrates. The secondary metabolites were extracted using Soxhlet’s method with ethanol as a solvent. Screening of the plants’ phytochemicals was done as per standard protocol. In vitro antimicrobial activity was assessed using the well diffusion method. The antibacterial activity of the ethanol extracts of the two plant leaves was evaluated against disease-causing bacteria, including Staphylococcus aureus, Escherichia coli, Pseudomonas aeruginosa, Streptococcus sp., and Klebsiella pneumoniae. The combined ethanolic plant extracts exhibited maximum antibacterial activity compared to the individual plant extracts.
Review On Stabilization Of Clayey Soil Using Industrial Waste Products
Authors: Neha Dongre, Dr. Sunil Sugandhi
Abstract: Soil stabilization can be explained as the alteration of soil properties by chemical or physical means in order to enhance the engineering quality of the soil. The main objectives of soil stabilization are to increase the bearing capacity of clay soil, its resistance to weathering processes, and soil permeability. The long-term performance of any construction project depends on the soundness of the underlying soils. Unstable clay soils can create significant problems for pavements or structures; therefore, soil stabilization techniques are necessary to ensure good stability of clay soil so that it can successfully sustain the load of the superstructure, especially in the case of highly active clay soils. Stabilization also saves considerable time and money compared with cutting out and replacing the unstable soil. This paper deals with a complete analysis of the improvement of clay soil properties and their stabilization using industrial waste sand and lime. The experimentation is carried out keeping lime constant at 20% and varying industrial waste sand at 10%, 20%, and 30%. Disposal of these waste materials is essential, as they cause hazardous effects on the environment. With the same intention, a literature review is undertaken on the utilization of solid waste materials for the stabilization of soils, and their performance is discussed.
The Future of Cybersecurity and the Challenges It Presents
Authors: Minal Jain, Mr. Aaron Dlima
Abstract: In recent years, the Internet has become an important part of daily life for people all over the world. At the same time, with the increase in online activities, cybercrimes are also increasing. Cybersecurity has made great strides in recent years to keep up with the rapid changes in cyberspace. Cybersecurity refers to the procedures a country or organization can use to protect its assets and information in cyberspace. Twenty years ago, the term “cybersecurity” was not recognized by the public. Cybersecurity affects not only individuals but also organizations and governments. In recent years, everything has become digital through technologies such as cybernetics, cloud computing, smartphones, and the Internet of Things. Cyberattacks raise concerns about privacy, security, and financial loss. Cybersecurity is a set of technologies, processes, and practices designed to prevent attacks, crimes, and unauthorized access to networks, computers, processes, and system information. The main purpose of this article is to provide an overview of the types of cybersecurity, why cybersecurity is important, and cybersecurity methods, tools, and issues. Cybersecurity protects the data and integrity of an organization’s network and computing assets from threats throughout the life cycle of a cyberattack.
Integration of IoT with AI and ML
Authors: Sm Hassab Qais
Learning Analytics Usability Factors Towards Learner Performance Assessment – A Review Of Literature
Authors: Mohammed Swaleh Mohammed, Alice Nambiro, Bostley Muyembe Asenahabi, Eric Sifuna
Abstract: The Covid-19 pandemic significantly accelerated the adoption of e-learning, resulting in increased investment in digital education platforms and the generation of vast amounts of learner data, commonly referred to as big data. This shift has propelled the development of learning analytics, a field focused on the systematic collection, analysis, and reporting of data to better understand learner behavior and enhance educational outcomes. Despite growing interest in learning analytics, there remains limited understanding of its usability, particularly in assessing student performance within virtual learning environments. This paper presents a critical literature review examining the usability factors of learning analytics in the context of Kenyan universities. Using a desktop research approach, the study synthesizes existing research to identify key gaps and highlight usability dimensions relevant to performance assessment. The review identifies four primary factors influencing the usability of learning analytics tools: perceived usefulness, perceived ease of use, user satisfaction, and learnability. These findings provide a foundation for future empirical studies aimed at evaluating the impact of these factors on learner performance in virtual settings.
Microplastic Pollution In Oceans: Detection And Removal Methods
Authors: Nanthini N. L., Sheeja B.
Abstract: Microplastics are tiny plastic pieces that are less than 5 mm in size. These small plastics enter the oceans through the breakdown of larger plastic waste or through products like face wash and toothpaste. They pose threats to marine life and potentially human health. This paper explains how microplastics are detected and removed, including advanced technologies such as spectroscopy and AI, as well as the challenges involved and the need for improved solutions.
AI and AI Robotics in Terrestrial and Extraterrestrial Architecture
Authors: Azar Djamali
Abstract: This study reviews the application of Artificial Intelligence (AI) and robotic tools in terrestrial and extraterrestrial architecture. The rapid advancement of AI and robotics is transforming the architectural profession, enhancing creative exploration through data aggregation and analytics while complicating decision-making due to the vast datasets involved. Architects must balance technical proficiency with strategic discernment to harness AI’s potential while addressing operational challenges. AI significantly impacts architectural design throughout the building lifecycle, influencing predictive analytics, construction supervision, and facility maintenance (Yangluxi Li et al.). As humanity moves toward interplanetary colonisation, architects are designing habitable structures for extreme environments. Notable projects include the Mars Habitat Concept by Foster + Partners and NASA (2015–2016), Mars Science City by Bjarke Ingels Group (BIG) for the UAE (2017–present), and Skidmore, Owings & Merrill’s (SOM) Moon Village concept, featuring inflatable modules, showcased at the Venice Architecture Biennale 2025. AI and AI-assisted robots, such as the Perseverance rover, enable adaptive and efficient design for harsh environments. This article synthesises insights from over 100 research papers, exploring how machine learning (ML), generative design, and autonomous systems will redefine extraterrestrial construction. By optimising structural integrity and resource utilisation, AI facilitates spatial optimisation and energy-efficient designs (Yangluxi Li et al., 2025). This study highlights the roles of AI, robotics, and automation in optimising terrestrial and extraterrestrial architecture, presenting scalable solutions for future colonies and providing actionable insights for space agencies and architects. By leveraging AI and robotic technologies, architecture can adapt to extreme environments and advance innovative designs for future inhabitants.
Review: Role of AI in Early Detection and Treatment of Cardiovascular Diseases
Authors: Kiran Kumar, V. Narmada, Sagarika Kulkarni
Abstract: Cardiovascular diseases (CVDs) continue to be the world’s leading cause of morbidity and mortality, so improvements in early detection, diagnosis, treatment, and management techniques are vital. The integration of artificial intelligence (AI), machine learning, and deep learning into cardiovascular medicine offers promising avenues to improve patient outcomes. This review explores recent progress in AI applications for CVDs, including automated electrocardiogram (ECG) analysis, medical imaging, wearable sensor technologies, and telemedicine. AI-driven systems have demonstrated potential in enhancing diagnostic accuracy, enabling remote monitoring, predicting disease risk, and supporting clinical decision-making. Despite significant advancements, challenges such as data bias, algorithmic fairness, and the need for rigorous clinical validation remain. Continued research and the responsible deployment of AI technologies can help address the global burden of CVDs through more precise, efficient, and personalized care.
DOI: https://doi.org/10.5281/zenodo.16719088
Comparative Seismic Performance Analysis Of Rectangular And Circular Columns: Effects Of Replacing Rectangular Columns With Circular Columns In RC Structures
Authors: Rahul Solanki, Murlidhar Chourasia, Rahul Kumar Satbhaiya
Abstract: Designing earthquake-resistant structures necessitates ensuring their safety throughout both the construction phase and their operational lifespan, regardless of the intensity or frequency of seismic events. Ground motion effects are particularly complex due to their dynamic and multifaceted impact on structural behaviour. Among the various analytical techniques available, pushover analysis is considered one of the most dependable for evaluating a structure’s response to intense seismic forces. This approach is based on the assumption that during seismic events, buildings predominantly respond in their fundamental or lower vibration modes. Therefore, a multi-degree-of-freedom (MDOF) system can be effectively transformed into an equivalent single-degree-of-freedom (ESDOF) model. This ESDOF model is derived using nonlinear static analysis and subsequently subjected to nonlinear time history or response spectrum analysis utilizing constant-ductility or damped response spectra. The seismic demand parameters obtained from the ESDOF model are then mapped back to the MDOF system through modal transformation techniques. In this research, the seismic behaviour of a moment-resisting RC frame structure is analyzed using the pushover method. The study focuses on the effect of changing column shapes and sizes, specifically replacing rectangular columns with circular ones, to assess variations in seismic performance. The static pushover approach incorporates both constant gravity loads and incrementally applied lateral loads at each story level. Capacity curves, illustrating the relationship between base shear and total story drift, are generated using ETABS 2015 to extract critical seismic response parameters. Throughout the simulation, the overall plan dimensions of the building are maintained constant, while only the column dimensions are varied. 
Three sets of rectangular column sizes are examined for their nonlinear seismic response and subsequently compared with equivalent circular columns. All structural models are designed following the provisions of IS 456:2000 for concrete design and IS 1893:2002 for seismic loading conditions.
DOI: http://doi.org/
Comparative Seismic Performance Analysis Of Rectangular And Circular Columns: Effects Of Replacing Rectangular Columns With Circular Columns In RC Structures: A Review
Authors: Rahul Solanki, Murlidhar Chourasia, Rahul Kumar Satbhaiya
Abstract: The seismic performance of reinforced concrete (RC) structures is significantly influenced by the geometry and detailing of their columns, which act as the primary load-bearing and energy-dissipating elements during ground motion. Among the available column cross-sections, rectangular shapes have traditionally been used due to ease of construction and integration with architectural plans. However, in high-seismic zones, circular columns are gaining attention for their superior ductility, confinement efficiency, and uniform stress distribution. This study presents a comparative analysis of the seismic behavior of RC frames with rectangular and circular columns, focusing on their performance under lateral loading conditions. An extensive literature review was conducted, highlighting the influence of column shape on key seismic performance parameters such as base shear capacity, lateral drift, energy dissipation, plastic hinge formation, and failure mechanisms. Analytical studies, experimental investigations, and code-based assessments consistently indicate that circular columns outperform rectangular ones in terms of ductility and post-yield behavior. Their symmetrical geometry allows better confinement of core concrete, resulting in enhanced resilience during strong ground shaking. Additionally, the review underscores the limitations of conventional force-based design methods in accurately predicting the inelastic behavior of structures, especially those with irregular geometries. Nonlinear static (pushover) analysis, displacement-based approaches, and accurate modeling of infill-wall interaction emerge as essential tools for realistic seismic performance evaluation. The findings of this study support the strategic replacement or incorporation of circular columns in RC frames to improve seismic resistance, particularly in retrofitting and performance-based design scenarios.
While practical challenges such as increased formwork complexity exist, the benefits in structural safety and energy absorption justify their use. This paper serves as a reference for engineers and researchers aiming to optimize RC structures for improved seismic resilience through column geometry selection.
DOI: https://doi.org/10.5281/zenodo.16728432
Farm-to-Film: Turning Wheat And Rice Straw Into Sustainable Bioplastics
Authors: Tushar Sharma, Dr. Rakesh Kumar
Abstract: Plastic pollution continues to pose a severe environmental threat worldwide. Simultaneously, the common practice of burning wheat and rice straw in agricultural fields leads to hazardous air quality and soil degradation. This study introduces a novel, sustainable alternative: transforming crop residues into biodegradable nanocellulose bioplastics. Through enzyme-assisted extraction, cellulose from straw can be processed into high-quality packaging material. This innovation not only curbs plastic pollution but also reduces air pollution and provides farmers with an alternative income stream. The paper emphasizes the need for policy support, industry collaboration, and rural engagement to scale this solution.
Evaluating Learning Analytics Usability Factors Towards Learner Performance Assessment In Virtual Environment In Kenyan Universities
Authors: Mohammed Swaleh Mohammed, Bostley Muyembe Asenahabi, Alice Nambiro, Eric Sifuna
Abstract: The purpose of the study was to evaluate learning analytics usability factors for e-learning learner performance assessment in Kenyan universities. A quantitative methodology was used: a five-point Likert scale questionnaire was distributed through random sampling to eight universities in Kenya, focusing on students using e-learning, whether blended or fully virtual. The findings revealed two factors: Perceived Usefulness and Perceived Ease of Use.
The Evolution And Strategic Imperative Of HR: Toward A New Model Of Organizational Impact Measurement
Authors: Assistant Professor Dr. Atul Kumar, Professor Dr. Vinit Kumar Sharma
Abstract: Over the course of the last hundred years, the Human Resources (HR) profession has undergone a profound evolution. What was once regarded primarily as an administrative and compliance-driven function has steadily developed into a critical strategic component within modern organizations. Initially tasked with duties such as payroll management, employee record-keeping, and enforcement of labor laws, HR has now assumed a broader and more influential role—one that encompasses talent management, leadership development, organizational culture, and workforce planning aligned with business objectives. Despite this strategic repositioning, a significant challenge continues to limit the credibility and impact of the HR function: the absence of comprehensive and reliable mechanisms for measuring its true contribution to organizational success. Traditional HR metrics, while useful for tracking operational efficiency (e.g., turnover rates, time-to-hire, or training hours), are often insufficient in demonstrating the extent to which HR initiatives support or drive strategic outcomes. As businesses face increasing pressure to quantify return on investment across all departments, HR must develop more sophisticated tools to validate its role as a value-adding partner. This paper addresses this critical issue by examining the limitations of conventional HR measurement systems and emphasizing the need for a more integrative and multidimensional approach. Drawing upon both theoretical frameworks and empirical evidence, the study proposes a comprehensive model designed to assess the influence of HR on organizational performance. This model goes beyond input-output measures and includes three core dimensions: people-related outcomes (such as employee engagement and leadership effectiveness), process efficiencies (such as the agility of HR interventions and innovation in talent practices), and performance metrics (including business growth, productivity, and customer impact). 
By capturing the complex interplay between human capital and business execution, this proposed framework aims to provide organizations with a practical and evidence-based method to assess the strategic value of their HR functions. Ultimately, the goal is to support HR’s continued evolution into a fully integrated and analytically grounded contributor to sustainable business success.
Music Experience Center
Authors: Harshita Bissa, Malini Nathe, Ar. Radhika Raut
Abstract: The Music Experience Centre is a pioneering cultural and educational initiative designed to serve as a vibrant hub where individuals can explore, understand, and engage with music in all its forms. Blending tradition with innovation, the center offers an immersive environment where the physical, emotional, historical, and technological dimensions of music intersect to inspire creativity, foster learning, and promote community engagement. At its core, the center functions as an interactive space that demystifies the world of music for both musicians and non-musicians alike. It features state-of-the-art sound installations, digital composition labs, instrument galleries, and acoustic exploration zones that allow visitors to physically and sonically engage with music. Visitors can experiment with traditional and modern instruments, understand the physics of sound, compose music using AI-assisted software, and explore diverse musical cultures through multimedia exhibits. In addition to its interactive exhibits, the center will offer a wide range of programs, including music therapy sessions, artist residencies, live performances, educational workshops, and school outreach programs. These initiatives aim to make music accessible and meaningful across age groups and cultural backgrounds, with special attention to inclusivity, neurodiversity, and mental well-being. The Music Experience Centre also serves as a research and innovation platform, encouraging collaboration between musicians, educators, sound engineers, and technologists. Through its partnerships with academic institutions, cultural organizations, and the creative industries, it fosters dialogue and experimentation that push the boundaries of how music is created, experienced, and understood. Ultimately, the Music Experience Centre envisions a world where music is not just heard but felt—a tool for expression, connection, education, and healing. 
It stands as a landmark destination for music lovers, innovators, and communities, offering a transformative experience that resonates long after each visit.
DOI:
Comparative Structural Performance Analysis Of Residential Buildings With Constant Floor Area And Varying Plan Shapes Using ETABS
Authors:
Abstract: Earthquakes are among the most destructive natural hazards, often leading to substantial structural damage, human casualties, and economic disruption. These events are caused by sudden energy release within the Earth’s crust, generating seismic waves that impact large geographic regions. Past incidents, such as the 2015 Nepal earthquake, have underscored the importance of seismic-resistant design, especially in vulnerable areas. This study presents a comparative structural performance analysis of high-rise residential buildings (G+20) with constant floor area but varying plan shapes, evaluated using ETABS software. The analysis focuses on four distinct geometric configurations and is conducted as per the seismic design provisions outlined in IS 1893 (Part 1): 2002, considering Zone III conditions specific to the Betul region of India. Key parameters assessed include seismic force distribution, bending moments, lateral displacements, and construction cost implications. The objective is to determine the most efficient plan configuration that offers an optimal balance between structural performance and economic viability under seismic loading. The findings aim to guide architects and engineers in adopting geometry-driven solutions that improve seismic resilience without increasing the building footprint.
DOI: http://doi.org/10.5281/zenodo.16737144
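The seismic force distribution assessed above begins from the IS 1893 (Part 1) design base shear, V_B = A_h * W with A_h = (Z/2)(I/R)(Sa/g). A minimal sketch with purely illustrative inputs (none of these values are taken from the paper):

```python
def design_base_shear(Z, I, R, Sa_g, W):
    """IS 1893 (Part 1): V_B = A_h * W, with A_h = (Z/2) * (I/R) * (Sa/g).

    Z    : zone factor (e.g. 0.16 for Zone III)
    I    : importance factor
    R    : response reduction factor
    Sa_g : normalized spectral acceleration Sa/g at the fundamental period
    W    : seismic weight of the building (kN)
    """
    A_h = (Z / 2.0) * (I / R) * Sa_g  # design horizontal seismic coefficient
    return A_h * W

# Hypothetical Zone III RC residential frame, illustrative seismic weight.
V_B = design_base_shear(Z=0.16, I=1.0, R=5.0, Sa_g=2.5, W=50_000.0)
print(round(V_B, 1))  # 2000.0 kN
```

The base shear is then distributed over the stories in proportion to story weight and height squared, which is where the plan shape begins to matter.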
Comparative Structural Performance Analysis Of Residential Buildings With Constant Floor Area And Varying Plan Shapes Using ETABS: A Review
Authors: Rammohan Saini, Ankita Singhai, Rahul Kumar Satbhaiya
Abstract: The geometric configuration of a building plays a pivotal role in determining its seismic response. As urban environments demand both architectural creativity and structural safety, understanding how plan shape affects performance under seismic loading is increasingly critical. This review paper synthesizes existing research focused on the comparative structural behavior of residential buildings with identical floor areas but varying plan geometries—including Rectangular, Square, Triangular, and Circular configurations. Analytical studies using ETABS, a widely adopted structural analysis and design software, form the basis of this comparison, with performance metrics such as base shear and lateral displacement under seismic loads examined. Findings across multiple studies suggest that geometry significantly influences seismic resilience, with triangular plans often demonstrating better base shear resistance, while regular shapes such as squares perform more uniformly in terms of displacement control. These insights reinforce the necessity of integrating geometric considerations into the early stages of structural design. The review also highlights the limitations of current linear static analyses and emphasizes the need for future research involving nonlinear dynamic methods, soil-structure interaction, and irregular high-rise forms. Ultimately, this paper contributes to the evolving discourse on shape-based seismic optimization and supports the development of safer, code-compliant residential structures in seismically active regions.
DOI: https://zenodo.org/uploads/16750644
Economic Growth & Digital Payment Transactions In India
Authors: Dr Ramesh
Abstract: One of the fastest-growing industries in India is digital payments, such as UPI payments, which banks are concentrating on for financial inclusion. Digital payments have the potential to guarantee rapid economic growth. By ensuring that bank account transactions are thoroughly examined, going cashless will help curb bribery. With the proper processes in place, this replacement will gradually eliminate paper money. The study also looks at how digital payment transactions affect e-commerce, retail, banking, and other areas of the Indian economy.
DOI: https://zenodo.org/uploads/16755158
Study On Use Of Recycled Construction And Demolition Waste In Structural Applications: A Life-Cycle And Performance-Based Evaluation
Authors: Dr Balaji Shivaji Pasare
Abstract: Background: Rural areas such as Osmanabad are plagued by sustained degradation of infrastructure driven by climate variation, material fatigue, and restricted maintenance capabilities. Structural concrete, while strong, is susceptible to microcracking that can expedite deterioration and impede long-term resilience. The recent advent of self-healing technologies, namely bio-concrete and polymer-fused conduit systems, offers promising solutions for low- or no-maintenance, climate-responsive, and adaptable buildings in impoverished areas. Objectives: The current study set out to evaluate the field performance, healing potential, and stakeholder acceptance of bio-concrete and polymer-based self-healing concrete (SHC) under semi-arid conditions in Osmanabad. It aims to support the transition from laboratory innovation to rural deployment conditions, highlighting humanised engineering and participatory validation. Methods: A mixed-method design of experimental trials in three gram panchayats was implemented with stakeholder involvement. Quantitative data comprised compressive strengths, crack closure rates, and environmental (humidity, temperature) correlation variables. Qualitative information was obtained through interviews, focus groups, and participatory observation. The analytic techniques used were ANOVA, regression modelling, and thematic coding of the data. Results: Bio-concrete exhibited enhanced crack healing (94% recovery) and strength increases (~20% compared to control mixes), especially in high-humidity zones. Positive correlations were found between healing rates and environmental humidity. The highest level of stakeholder trust was observed among farmers (avg. rating: 9.1/10), who noted that it helped minimize maintenance and appeared to self-heal. The polymer-based SHC showed milder performance, though lower photo-hysteresis.
Conclusion: Bio-concrete is found to be a socially and technically acceptable, climate change resilient alternative for rural infrastructure. The research confirms its relevance in Osmanabad and supports community-scale scaling. Through the convergence of high-performance materials and ethical, participatory adoption, this research configures a new model of resilient, humanised infrastructure for the disenfranchised.
DOI: https://doi.org/10.5281/zenodo.16753496
Aquatic Trash Collector Bot
Authors: Ishan Deshpande, Mohit Jagtap, Akash Satras, Mrs. Shreyasi Watve
Abstract: India’s vibrant cultural traditions involve many water-based religious rituals, which unintentionally contribute to environmental pollution. During festivals such as Ganesh Visarjan and Kumbh Mela, water bodies like the Godavari River in Nashik often become heavily contaminated due to the disposal of idols, flowers, and plastic items. These pollutants accumulate on the water surface, disturbing the aquatic balance. To address this issue, we propose an eco-conscious solution: an automated bot system designed for surface waste collection. This bot uses renewable energy to operate and effectively removes plastic, debris, and water hyacinth from still water bodies, supporting a cleaner and more sustainable environment.
DOI: https://zenodo.org/uploads/16755368
The Algorithmic Republic: The Social Impacts Of AI In Singapore
Authors: Lionel Seah
Abstract: Artificial Intelligence (AI) is the field of study and development of computer systems capable of performing tasks that would typically require human intelligence. These tasks include reasoning, problem-solving, learning, understanding natural language, and perceiving the environment. As defined by John McCarthy, one of the pioneers of AI, “Artificial intelligence is the science and engineering of making intelligent machines, especially intelligent computer programs” (McCarthy, 2007). AI aims to replicate or simulate human cognitive functions in machines to enhance or automate decision-making, pattern recognition, and interactions. According to Stuart Russell and Peter Norvig, in their foundational textbook Artificial Intelligence: A Modern Approach, “AI is concerned with intelligent behaviour in artifacts” (Norvig, 2021). They categorise AI systems based on their capabilities to act and think rationally or humanly.
Comprehensive Review Of Structural Analysis Techniques For Gravity Dams Using STAAD.Pro Under Diverse Reservoir Conditions
Authors: Avdesh Kumar Ahirwar, Assistant Professor Murlidhar Chourasia, Rahul Kumar Satbhaiya
Abstract: Gravity dams, typically constructed using concrete or masonry, are massive hydraulic structures designed to resist external loads primarily through their own weight. These dams usually exhibit a triangular cross-sectional profile, with a wide base and a narrow crest, ensuring inherent stability against hydrostatic and seismic forces. Accurate analysis of such structures is essential for ensuring their safety and performance under varying reservoir conditions. This study presents a detailed evaluation of gravity dam behavior using STAAD.Pro, widely adopted structural analysis software. While traditionally employed for designing framed structures such as buildings, STAAD.Pro is also capable of modeling complex elements including plates, shells, and solid components. This flexibility makes it suitable for simulating gravity dams under different loading scenarios. In this analysis, the dam is modeled using solid elements to accurately represent its mass and geometry. The effects of hydrostatic pressure, uplift pressure, and other reservoir-induced forces are incorporated to simulate real-world conditions. By leveraging the computational capabilities of STAAD.Pro, stress distribution, deformation profiles, and stability parameters of the dam are systematically investigated. This approach eliminates the limitations of manual analysis, offering a time-efficient and precise method to assess dam safety under diverse operational conditions.
DOI: https://doi.org/10.5281/zenodo.16759714
Assessment Of Climate Change Impacts On Water Resources In Arid And Semi-Arid Regions: A Case Study Of Hargeisa, Somaliland
Authors: Ahmed Abshir Ahmed
Abstract: This study investigates the impacts of climate change on water resources in arid and semi-arid regions, with a focused case study on Hargeisa, Somaliland. As climate change intensifies hydrological variability, regions already facing environmental and demographic pressures experience exacerbated water scarcity. Analyzing climatic data (1991-2020) from SWALIM and regional groundwater studies, we identify three critical trends: (1) decreasing rainfall reliability (annual averages of 250-400mm with high interannual variability), (2) increasing evapotranspiration rates (up to 2100mm annually), and (3) progressive groundwater quality deterioration (TDS >1g/L in 90% of sampled wells). Our findings demonstrate particular vulnerability in shallow aquifers – the primary water source for most communities – which face both seasonal depletion and contamination risks. The research reveals a dual stressor system where climate variability interacts with rapid population growth to threaten water security. We propose four key interventions: (i) enhanced hydro-climatic monitoring networks, (ii) integrated climate adaptation planning, (iii) targeted aquifer recharge strategies, and (iv) policy reforms for sustainable groundwater governance. These recommendations provide actionable pathways for building resilience in water-stressed regions facing escalating climate risks.
DOI: https://doi.org/10.5281/zenodo.16791946
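The "high interannual variability" quantified in the abstract above is commonly expressed as the coefficient of variation (CV) of the annual rainfall series. A minimal sketch using a hypothetical series within the 250-400 mm band the study reports (these numbers are illustrative, not the study's data):

```python
import statistics

def rainfall_variability(annual_mm):
    """Mean and coefficient of variation (CV, %) of an annual rainfall series.
    A high CV indicates low rainfall reliability."""
    mean = statistics.mean(annual_mm)
    cv = 100.0 * statistics.stdev(annual_mm) / mean  # sample standard deviation
    return mean, cv

# Hypothetical eight-year series in the reported 250-400 mm range.
mean, cv = rainfall_variability([260, 390, 250, 400, 300, 270, 380, 310])
print(round(mean), round(cv, 1))
```

A CV near or above 20% is typically read as strongly unreliable rainfall for rain-fed agriculture and recharge planning.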
Comprehensive Structural Analysis Of Gravity Dams: Evaluating Performance Under Full, Empty, And Partial Reservoir Conditions Via STAAD.Pro
Authors: Avdesh Kumar Ahirwar, Murlidhar Chourasia, Rahul Kumar Satbhaiya
Abstract: Gravity dams are essential infrastructural elements used for water storage, flood control, and hydropower generation. Their structural safety and performance under varying loading and reservoir conditions are critical due to the potential risks associated with failure. This paper presents a comprehensive review of structural analysis techniques for gravity dams, with a specific focus on modeling through STAAD.Pro software. The study highlights the application of finite element methods (FEM) in simulating dam behavior under different forces, including hydrostatic pressure, uplift pressure, seismic effects, and self-weight. The role of STAAD.Pro as a versatile tool capable of handling complex geometries and material properties is explored in detail. Emphasis is placed on its ability to model solid elements and perform static and dynamic analyses in accordance with IS 6512:1984 and IS 875 standards. The review synthesizes results from multiple case studies and simulations, examining factors such as stress distribution, displacement, sliding resistance, overturning moments, and shear friction parameters. Findings indicate that STAAD.Pro provides accurate and reliable predictions for assessing dam safety under full and partial reservoir conditions. Furthermore, the study identifies common stress concentration zones—particularly at the heel—and discusses reinforcement strategies to mitigate structural vulnerability. It also evaluates the factor of safety under various loading combinations and confirms compliance with national safety codes. This paper contributes to the evolving methodology of dam design and evaluation, offering valuable insights for engineers, researchers, and policymakers aiming to enhance the resilience of gravity dams through advanced numerical modeling.
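The sliding resistance and overturning checks mentioned above are classically performed per metre run of the dam section, and STAAD.Pro results are often cross-checked against these hand formulas. A sketch with purely illustrative numbers (uplift neglected for brevity):

```python
def hydrostatic_thrust_kn(h_m, gamma_w=9.81):
    """Horizontal water thrust per metre run (kN): 0.5 * gamma_w * h^2."""
    return 0.5 * gamma_w * h_m ** 2

def sliding_fos(mu, sum_v_kn, sum_h_kn):
    """Factor of safety against sliding: mu * sum(V) / sum(H)."""
    return mu * sum_v_kn / sum_h_kn

def overturning_fos(restoring_knm, overturning_knm):
    """Factor of safety against overturning about the toe."""
    return restoring_knm / overturning_knm

# Hypothetical 40 m deep reservoir, per metre run of dam section.
H = hydrostatic_thrust_kn(40.0)             # water thrust, kN
fos_slide = sliding_fos(0.75, 20_000.0, H)  # assumed weight and friction
fos_ot = overturning_fos(restoring_knm=350_000.0,
                         overturning_knm=H * 40.0 / 3.0)  # thrust acts at h/3
print(round(H, 1), round(fos_slide, 2), round(fos_ot, 2))
```

In a full check per IS 6512:1984, uplift pressure reduces the effective vertical load and must be included; it is omitted here only to keep the sketch short.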
Linking E-Learning Effectiveness With Employee Performance Metrics
Authors: Ms. Shruti Rawat, Manasvee Jain
Abstract: In today’s fast-paced and digitally driven work environment, organizations are increasingly relying on e-learning platforms for employee training and development. While digital training offers flexibility, scalability, and cost-efficiency, its actual impact on employee performance remains a critical area of inquiry. This study explores the relationship between e-learning effectiveness and employee performance metrics, aiming to bridge the gap between training delivery and measurable workplace outcomes. By analyzing data from employees across multiple departments in a mid-sized technology company, the research examines how engagement with online training modules correlates with key performance indicators such as task accuracy, productivity levels, customer satisfaction scores, and overall goal completion rates. A combination of pre-training and post-training performance data, user feedback, and system usage logs were used to assess the tangible benefits of e-learning initiatives. The findings suggest a positive link between well-structured e-learning programs and improved employee performance, particularly when courses are interactive, aligned with job roles, and supported by timely feedback. However, the study also highlights that the effectiveness of digital training is influenced by factors such as learner motivation, management support, and course design quality. This research underscores the importance of integrating performance metrics into e-learning evaluations to ensure learning investments translate into real-world results. The insights provided can help HR professionals, training managers, and organizational leaders refine their digital learning strategies for maximum impact.
DOI: http://doi.org/10.5281/zenodo.16792612
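The link between training engagement and performance metrics described above is, at its simplest, a correlation question. A minimal sketch of a Pearson correlation between module engagement and a KPI gain, using entirely hypothetical data (none of these numbers come from the study):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical: hours spent in e-learning modules vs. post-training KPI gain.
hours = [2, 5, 1, 8, 6, 3, 7, 4]
kpi_gain = [1.0, 3.5, 0.5, 6.0, 4.0, 2.0, 5.5, 2.5]
print(round(pearson_r(hours, kpi_gain), 3))
```

Correlation alone does not separate training effect from learner motivation, which is why the study's use of pre/post comparisons and feedback data matters.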
A Review Of The Role Of Machine Learning In Designing A Proposed Ransomware Detection Technique
Authors: Mr. Kartik, Dr. Bijendra Singh, Dr. Kavita
Abstract: This research aims to analyze and design an effective ransomware detection technique using machine learning algorithms. The study explores various ML approaches—such as classification, anomaly detection, and clustering—and evaluates their performance in distinguishing ransomware activity from benign system behavior. Key features, such as file access patterns, process activities, and network communication, are extracted and analyzed to train and test ML models capable of early detection with high accuracy and low false positives. The primary aims of this study are to understand the behavioral characteristics of ransomware attacks; identify and select relevant features for effective detection; evaluate different machine learning models based on precision, recall, F1-score, and accuracy; and propose a novel or improved ML-based detection framework tailored for real-time ransomware threat identification. This research contributes to the ongoing efforts to fortify cybersecurity by presenting a data-driven, machine learning-powered methodology that enhances early detection capabilities, thereby reducing potential damage and enabling quicker incident response.
DOI:
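The evaluation criteria named in the abstract above (precision, recall, F1-score, accuracy) all follow from the confusion-matrix counts of a detector. A minimal sketch with hypothetical counts (not results from the study):

```python
def classification_metrics(tp, fp, fn, tn):
    """Precision, recall, F1, and accuracy from confusion-matrix counts."""
    precision = tp / (tp + fp)          # of flagged samples, fraction truly ransomware
    recall = tp / (tp + fn)             # of ransomware samples, fraction caught
    f1 = 2 * precision * recall / (precision + recall)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, f1, accuracy

# Hypothetical detector run: 90 ransomware samples caught, 10 missed,
# 5 benign processes falsely flagged, 895 benign correctly ignored.
p, r, f1, acc = classification_metrics(tp=90, fp=5, fn=10, tn=895)
print(round(p, 3), round(r, 3), round(f1, 3), round(acc, 3))
```

Note that with imbalanced data (far more benign than ransomware samples), accuracy alone is misleading, which is why the review emphasizes precision, recall, and F1 together.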
A Review On IoT-Integrated Artificial Intelligence For Smart Irrigation Systems: Trends, Technologies, And Challenges
Authors: Nitish Sharma, Dr Komal Garg
Abstract: The rising demand for food worldwide and declining freshwater resources have intensified the need for efficient agricultural practices. Smart irrigation, powered by the integration of the Internet of Things (IoT) and Artificial Intelligence (AI), offers a transformative solution for precise and sustainable water management. This review examines current advancements in smart irrigation technologies, concentrating on IoT-enabled sensor networks, real-time environmental monitoring, and AI-powered decision-making models such as machine learning and predictive analytics. It traces the evolution of these technologies, their application in precision farming, and the synergistic benefits of combining IoT and AI. Moreover, the review highlights implementation challenges, including high costs, energy constraints, data security, and region-specific limitations. The paper concludes with future research directions aimed at enhancing system efficiency, adaptability, and accessibility, especially in water-scarce and developing regions.
DOI: http://doi.org/10.5281/zenodo.16810519
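As a baseline contrast to the AI models the review above surveys, a plain rule-based irrigation controller can be sketched as follows. Every threshold and conversion factor here is an illustrative assumption, not a value from any reviewed system:

```python
def irrigation_minutes(soil_moisture_pct, forecast_rain_mm,
                       target_pct=35.0, mm_per_minute=0.2, pct_per_mm=0.5):
    """Rule-based baseline for an IoT irrigation controller.

    Irrigates only when soil moisture is below target and the deficit is
    not already covered by forecast rainfall. All parameters (target
    moisture, pump rate, mm-to-moisture conversion) are hypothetical.
    """
    deficit_pct = target_pct - soil_moisture_pct
    deficit_pct -= forecast_rain_mm * pct_per_mm  # credit expected rain
    if deficit_pct <= 0:
        return 0.0                                 # no irrigation needed
    mm_needed = deficit_pct / pct_per_mm
    return mm_needed / mm_per_minute               # pump run time in minutes

print(irrigation_minutes(soil_moisture_pct=25.0, forecast_rain_mm=0.0))   # irrigate
print(irrigation_minutes(soil_moisture_pct=25.0, forecast_rain_mm=30.0))  # rain covers it
```

The ML and predictive-analytics approaches in the review effectively learn these thresholds and rain-crediting rules from sensor and weather data instead of hard-coding them.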
Next-Generation Materials For High-Performance Flexible Antennas: A Comprehensive Review
Authors: Dhrisya S. Anil, Dr Abhilash S. Vasu
Abstract: Flexible antennas are key components in modern wireless systems, valued for their lightweight design, adaptability, and integration with non-planar surfaces. They are categorized into four main types: conformal antennas, which adapt to curved structures for aerodynamic and stealth uses; wearable or textile antennas, integrated into clothing or body-worn devices for healthcare, military, and sports; reconfigurable antennas, which dynamically adjust frequency, radiation pattern, or polarization; and fluidic or movable architectures, utilizing liquid metals or mechanical actuation for tunability and shape adaptability. Material selection significantly influences both mechanical flexibility and electromagnetic performance. Traditional conductors like copper and silver offer high conductivity but require special techniques for flexibility. Conductive polymers and composites combine electrical performance with mechanical compliance and environmental resistance. Textile-based conductors integrate antennas directly into fabrics for comfort and durability. Advanced flexible substrates such as polyimide, PDMS, LCP, and TPU provide low dielectric loss and resilience under stress. This review outlines classification, materials, and fabrication advances, emphasizing their role in enabling next-generation communication technologies like 5G/6G, IoT devices, aerospace systems, and wearable healthcare solutions. Flexible antennas promise compact, unobtrusive, and high-performance wireless connectivity for future applications.
DOI: https://doi.org/10.5281/zenodo.16869714
Student-Alumni Platform
Authors: Moushami D, Rachna V, Srinidhi B
Abstract: The Student-Alumni Platform is an online educational community designed to build connections between the current students and alumni of an institution. The platform helps students network, seek mentorship, and access development resources that support their growth and learning, while alumni can offer guidance, share job opportunities, and contribute to their institution and its students. By providing a seamless and interactive interface, the platform bridges the gap between students and alumni. Its core features include user profile creation, login, a student dashboard, and an alumni dashboard with connection requests and messaging. The platform is built using HTML, CSS, and JavaScript. As it evolves, future enhancements such as advanced messaging and group chats, LinkedIn integration, and mobile access will further enrich user engagement and make the application more dynamic and useful. The platform also helps build lifelong relationships by promoting connections between students and alumni.
Strategic Challenges And Solutions In Implementing AI And IoT For Green Tech Adoption In The IT Supply Chain Ecosystem
Authors: Viraj P. Tathavadekar
Abstract: Combining Internet of Things (IoT) and artificial intelligence (AI) technologies in green technology adoption within IT supply chain ecosystems presents both unprecedented opportunities and complex challenges. This research investigates the strategic barriers, implementation solutions, and performance outcomes of AI-IoT integration for sustainable IT supply chain management. Through quantitative analysis of 350 IT companies across different maturity levels, this study examines the relationships between technological readiness, implementation challenges, and green technology adoption success. The findings reveal significant correlations between AI-IoT integration levels and sustainability performance metrics, while identifying critical success factors for overcoming implementation barriers. The research contributes to the body of literature on digital transformation in sustainable supply chain management and offers empirical guidance to practitioners and policymakers.
Fuzzy Modelling Techniques For Real Life Applications Via Data Analytics
Authors: Dr. V.Vijayalakshmi, Dr. D. Sridevi, L. Mohan, N.Sundarakannan
Abstract: In genomic analysis, a comprehensive theoretical study is the need of the hour. Genomic analysis is an elaborate network comprising vast, heterogeneous data, which must be organized into comprehensible, practical form from which critical information can be extracted. A feasible procedure called SILE (Search, Identification, Load and Exploitation) is applied, with appropriate alterations, to assimilated genomic data associated with a certain disease. The principal aim is to propose natural drugs for treating patients diagnosed with the disease, achieved by substantiating genes that have mutated. In this research, triangular fuzzy numbers are incorporated by considering three symptoms for examining the genes.
Trends And Techniques In Recommendation Systems: A Survey
Authors: Labdhi Jain, Rajesh Dhakad (Associate Professor)
Abstract: In the current digital era, the volume of data produced every second is staggering, making it challenging for users to find relevant information. Recommendation systems utilize extensive data and data mining techniques to analyze large amounts of data and provide accurate, personalized suggestions. Recommendation systems are information filtering systems that provide particular suggestions for items that are most pertinent to a particular user or a group of users. The algorithms and methods used for recommender systems are Content-Based Filtering, Collaborative Filtering, and Hybrid Methods. Recommendation systems include diverse applications and domains such as books, e-commerce services, social network services, movies, and tourism services. Key evaluation metrics of different recommender systems are discussed to provide insights into the assessment of models and the optimization of their performance. Globally, recommendation systems have become important. The purpose of this paper is to include and give knowledge of each method, from a traditional-based recommendation system to a deep learning-based recommendation system. By synthesizing current trends, challenges, and future research directions, this paper offers a comprehensive understanding of the recommendation system for both researchers and industry professionals.
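To make the Collaborative Filtering method mentioned in the survey concrete, the following sketch implements minimal user-based collaborative filtering with cosine similarity; the rating matrix, user names, and neighbourhood size are illustrative assumptions, not from the paper.

```python
import math

def cosine(u, v):
    """Cosine similarity between two rating vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def recommend(target, ratings, k=2):
    """User-based collaborative filtering: score items the target has not
    rated (rating 0) by similarity-weighted ratings of the k nearest users.
    ratings: {user: [rating per item, 0 = unrated]}."""
    peers = sorted((u for u in ratings if u != target),
                   key=lambda u: cosine(ratings[target], ratings[u]),
                   reverse=True)[:k]
    scores = {}
    for item, r in enumerate(ratings[target]):
        if r == 0:
            num = sum(cosine(ratings[target], ratings[u]) * ratings[u][item]
                      for u in peers)
            den = sum(cosine(ratings[target], ratings[u]) for u in peers) or 1.0
            scores[item] = num / den
    return sorted(scores, key=scores.get, reverse=True)
```

Content-based and hybrid methods differ only in how the similarity and candidate sets are built, which is the axis along which the survey organizes the field.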
Smart IoT-Driven Wearable Safety Band
Authors: Assistant Professor R. Saranya, Naviya Ka, Brindhavini P, Nisha L
Abstract: She-Shield is a wearable IoT device designed to enhance personal safety for women, developed on a Raspberry Pi platform and integrating several critical sensors. Temperature and heartbeat sensors monitor physiological changes that may indicate distress or health issues. In active mode, a touch sensor allows the user to manually activate the device during an emergency, initiating a loud buzzer and alarm to deter attackers. The device is powered by a rechargeable lithium battery, ensuring long-lasting performance and reliability. A voice detection sensor uses a Support Vector Machine (SVM) algorithm to identify both general vocal patterns and distress screams, ensuring that the device can accurately recognize a woman’s voice in various contexts. The built-in GPS provides real-time location tracking. Combined with real-time streaming, video storage, and alert messaging, the device employs a dual-alarm system: one alarm is emitted from the device itself, and a second alert is sent via i
Artificial Intelligence In Education – Transforming Higher Education In India
Authors: Pratik Nikam, Dr. Priyanka Singh
Abstract: Artificial intelligence (AI) is transforming higher education in India by enabling personalized learning, enhancing student engagement, and providing educators with data-driven tools to optimize teaching. This paper explores AI’s potential to create adaptive learning environments, improve accessibility, and foster holistic student development. Through AI-powered platforms, virtual tutors, and analytics, education is becoming more inclusive and efficient. However, challenges like ethical concerns, data privacy, and equitable access must be addressed to ensure responsible adoption. This study advocates for a future where AI enhances learning outcomes while maintaining fairness and inclusivity, preparing students for a dynamic world.
Machine Learning In Prediction Of Fuel Efficiency In The Automotive Industry
Authors: Aviichal Sharma
Abstract: This study explores how machine learning algorithms can help increase fuel efficiency. The implemented model is trained on a dataset consisting of many features and attributes affecting a vehicle’s fuel efficiency, such as MPG, number of cylinders, horsepower, vehicle weight, and more. For training the model, several machine learning models that fit the dataset variables were studied and implemented. After extensive testing and analysis, the Random Forest Regression technique was found to perform better than other algorithms in predicting fuel economy. It was the most appropriate algorithm for the research goal because of its capacity to manage intricate interactions between the input variables and accurately anticipate fuel usage. Random Forest Regression was demonstrated to be a potent approach to improving fuel economy prediction accuracy by utilizing an ensemble of decision trees and feature randomness. This study’s conclusion emphasizes the enormous potential of machine learning for enhancing fuel efficiency in the automotive sector. Random Forest Regression was determined to be the best technique for forecasting fuel efficiency after investigation. By taking into account several important criteria and investigating alternative algorithms, the study paves the path for improvements in resource optimization and environmental sustainability. The objective is to encourage industry leaders to use machine learning as a catalyst for change, advancing the automobile industry toward a greener and more effective future.
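The core idea behind Random Forest Regression is averaging many trees fit on bootstrap samples of the data. The toy sketch below uses depth-1 trees (stumps) instead of full decision trees, and the weight/MPG numbers are illustrative, not the paper's dataset; it is a sketch of the ensemble principle, not the authors' implementation.

```python
import random

def stump_predict(feat, threshold, left_val, right_val, x):
    """Depth-1 regression tree: one split, two constant leaves."""
    return left_val if x[feat] <= threshold else right_val

def fit_stump(X, y):
    """Pick the (feature, threshold) split minimizing squared error."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [yi for row, yi in zip(X, y) if row[f] <= t]
            right = [yi for row, yi in zip(X, y) if row[f] > t]
            if not left or not right:
                continue
            lv, rv = sum(left) / len(left), sum(right) / len(right)
            err = (sum((yi - lv) ** 2 for yi in left)
                   + sum((yi - rv) ** 2 for yi in right))
            if best is None or err < best[0]:
                best = (err, f, t, lv, rv)
    if best is None:  # degenerate bootstrap sample: fall back to the mean
        m = sum(y) / len(y)
        return (0, float("inf"), m, m)
    return best[1:]

def fit_forest(X, y, n_trees=25, seed=0):
    """Fit each stump on a bootstrap resample of (X, y)."""
    rng = random.Random(seed)
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in X]
        trees.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))
    return trees

def forest_predict(trees, x):
    """Average the stump predictions."""
    return sum(stump_predict(f, t, lv, rv, x) for f, t, lv, rv in trees) / len(trees)
```

A production model would use a library implementation (e.g. scikit-learn's `RandomForestRegressor`) with deep trees and per-split feature subsampling, but the averaging mechanism is the same.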
Megawatt Level Electric Vehicle Charging Station
Authors: Allagadapa Bharath, Deshaboina Sumanth
Abstract: This paper presents a comprehensive strategy for the efficient management of electric vehicle charging stations (EVCSs) integrated with a grid-side Modular Multilevel Converter (MMC) interface. The MMC topology is selected for its ability to directly connect to medium voltage grids without transformers, offering benefits such as reduced space and material usage. A key challenge addressed is the unbalanced load distribution caused by varying charging demands and random EV arrival-departure patterns, which can lead to significant disparities among MMC arms and internal modules. To ensure balanced and sinusoidal grid currents as well as stable module voltages, the proposed method integrates a Load Management (LM) algorithm with a Power Flow Management (PFM) algorithm. The LM algorithm schedules and allocates EV charging sessions to reduce phase and arm-level load variations, while the PFM regulates circulating currents to mitigate any remaining unbalances. Simulation results on a realistic scenario—such as a shopping mall EVCS—demonstrate that the proposed LM-PFM approach significantly enhances energy delivery and system stability, outperforming conventional rule-based methods. Real-time simulations further validate the feasibility of the proposed solution for practical deployment.
F-HSRP: A Federated, Trust-Aware, And Energy-Efficient Secure Routing Protocol For Scalable And Privacy-Preserving IoT Networks
Authors: Piyali Ghosh , Dr. Dhirendra Kumar Tripathi
Abstract: With the accelerated growth of the Internet of Things (IoT), providing secure, scalable, and privacy-preserving communication has become a serious issue. Current routing protocols such as AODV, DSR, and HSRP are incapable of addressing the complex needs of today’s IoT systems, particularly in large-scale, heterogeneous, and energy-constrained systems. This paper introduces the Federated Hybrid Secure Routing Protocol (F-HSRP)—a new paradigm that combines federated learning, trust-based routing, AES-256 encryption, and blockchain-aided route verification to address these issues holistically. F-HSRP utilizes a light-weight Convolutional Neural Network (CNN) at the edge of IoT networks. With federated learning, local anomaly detection models are trained at the nodes, maintaining data privacy and facilitating real-time accurate threat detection. Routing decisions are informed through a composite trust score based on node behavior, residual energy, and anomaly scores. Secure data transfer is enabled by AES-256 encryption and a lightweight Proof-of-Authority (PoA) blockchain process that guarantees tamper-proof route verification without imposing substantial overhead. The protocol is tested with the Bot-IoT dataset and a hybrid simulation platform that integrates NS-3 and TensorFlow Federated. The results indicate F-HSRP outperforms conventional protocols with 96.3% anomaly detection rate, 27% energy efficiency improvement, and better packet delivery and delay metrics. It also successfully fights blackhole, replay, and Sybil attacks. By integrating federated intelligence, cryptographic security, and blockchain consensus, F-HSRP offers a strong, energy-efficient, and privacy-enhanced routing solution for real-time IoT applications in smart cities, industrial control, healthcare monitoring, and military systems.
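The composite trust score described above combines node behavior, residual energy, and anomaly scores. A minimal sketch of such a score and a trust-based next-hop choice is below; the weights, function names, and neighbor data are illustrative assumptions, since the paper does not publish its exact weighting.

```python
def trust_score(behavior, residual_energy, anomaly, weights=(0.4, 0.3, 0.3)):
    """Composite trust in [0, 1]: good behavior and high residual energy
    raise trust, a high anomaly score lowers it. Weights are illustrative."""
    wb, we, wa = weights
    return wb * behavior + we * residual_energy + wa * (1.0 - anomaly)

def pick_next_hop(neighbors):
    """Route via the most trusted neighbor.
    neighbors: {node_id: (behavior, residual_energy, anomaly_score)}."""
    return max(neighbors, key=lambda n: trust_score(*neighbors[n]))
```

In F-HSRP the anomaly input would come from the federated CNN models, so misbehaving nodes (e.g. blackhole attackers) accumulate high anomaly scores and are routed around.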
The Impact Of AI On Back-Office Logistic Operations And Logistic Shared Service Operations: 6 Key Impacts In 2025
Authors: Sandipan Chakraborty
Abstract: Back-office logistics and shared services are being re-architected by AI in 2025. Beyond warehouse robots and route optimizers, the largest productivity lift is happening in the “paperwork and pixels” of logistics: order capture, document processing, freight audit & pay, customer service, and compliance. Drawing on current India-market data and public programs (ULIP, GST e-invoicing, ONDC) and global benchmarks (World Bank LPI), this journal synthesizes six concrete AI impacts that leaders can deploy now: (1) intelligent document processing and touchless workflows; (2) predictive ETA and exception control towers; (3) dynamic rating, tendering, and contract optimization; (4) forecasting for capacity, working capital, and SLA staffing; (5) AI copilots for shared-services agents; and (6) digital compliance across GST/e-invoicing and trade. We quantify the opportunity, map enabling Indian rails/APIs, list risks and controls, and close with a practical upskilling and tools roadmap tailored to India.
Advances In Dealing With Long-Term Dependencies: From Vanishing Gradients To Transformer Architectures And Beyond
Authors: Poroni Koiknzi Fousseni, Elvis Thierry Sounna Vofo
Abstract: Long-term dependency modeling remains one of the fundamental challenges in sequence processing tasks across natural language processing, time series analysis, and sequential decision-making. This paper presents a comprehensive analysis of methods for handling long-term dependencies, examining the evolution from traditional recurrent neural networks (RNNs) to modern attention-based architectures. We provide theoretical foundations for the vanishing gradient problem, analyze key architectural innovations including Long Short-Term Memory (LSTM), Gated Recurrent Units (GRU), and Transformer models, and discuss emerging approaches such as State Space Models and Linear Attention mechanisms. Our analysis includes mathematical formulations, computational complexity considerations, and empirical performance comparisons across various sequence modeling tasks. We identify current limitations and propose future research directions for improving long-range sequence modeling capabilities in deep learning systems.
DOI: https://doi.org/10.5281/zenodo.16908505
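The vanishing gradient problem the paper analyzes can be seen in the simplest possible recurrence: for a scalar linear RNN h_t = w · h_{t-1}, the gradient of h_T with respect to h_0 is w^T, so it decays or blows up geometrically in the sequence length. A tiny demonstration (illustrative, not from the paper):

```python
def gradient_magnitude(w, steps):
    """For the scalar recurrence h_t = w * h_{t-1}, d h_T / d h_0 = w ** T.
    Backpropagated gradients therefore shrink (|w| < 1) or explode (|w| > 1)
    geometrically with the number of time steps."""
    g = 1.0
    for _ in range(steps):
        g *= w
    return abs(g)
```

LSTM/GRU gating and Transformer attention both address this by creating paths through which the gradient is not repeatedly multiplied by the same contractive factor.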
Evaluating The Effectiveness Of Zero-Trust Architecture Principles In Reducing Cloud-Based Authentication Threats & Vulnerabilities
Authors: Victor Otieno Mony, Anselemo Peters Ikoha, Roselida O. Maroko
Abstract: The increasing complexity of cyber threats and the widespread adoption of cloud-based services have significantly exposed traditional authentication mechanisms to evolving vulnerabilities. To reduce the severity of these threats, several mitigation mechanisms such as Multi-Factor and Two-Factor Authentication, Biometric Authentication, and Key Hashing Protocols, among others, have been employed. However, existing mitigation strategies have proven insufficient in addressing the dynamic nature of cloud-based authentication threats and vulnerabilities. In response, this paper looks at alternative, better cloud-based authentication mitigation mechanisms through the adoption of Zero Trust Architecture (ZTA) paradigms. The paper evaluates five Zero Trust principles against five cloud-based authentication attack vectors for effectiveness in reducing cloud-based threats and vulnerabilities. The cloud-based authentication-related Zero Trust principles evaluated are Least Privilege, Continuous Monitoring, Encryption, Strong Authentication, and Policy Enforcement. The five authentication threat categories whose attack vectors are used in the evaluation are Brute Force Attacks, Denial of Service Attacks, Social Engineering Attacks, Man-in-the-Middle Attacks, and Password Discovery Attacks. The evaluation analyses the ZTA principles against these five threat categories to determine effectiveness. The results of the evaluation indicate that the ZTA principle of Policy Enforcement has the broadest impact across all five threat categories, while the other evaluated Zero Trust principles offer only partial mitigation of cloud-based authentication threats. This is because Policy Enforcement has deeper, more comprehensive coverage across the selected threat vectors and encompasses a higher number of Zero Trust sub-principles.
The paper thus concludes that the Zero Trust principle of policy enforcement is the most suitable foundation for designing a threat-responsive ZTA implementation scheme.
Quantitative and FTIR Spectroscopic Assessment of Phytochemical Constituents in Dacryodes edulis Aqueous Leaf Extract
Authors: Ejiogu C. C., Ojiaku A. A., Oguzie E. E., Njoku-Tony R. F.
Abstract: Plants contain natural medicinal resources that are beneficial to human health and well-being and offer health benefits to populations worldwide. The crude leaf extract of Dacryodes edulis was explored for its antioxidant activity, with quantitative determination and spectroscopic analysis of its phytochemical constituents using Fourier Transform Infrared Spectroscopy (FTIR). The leaf extract was found to have good antioxidant activity (61.84%) using the DPPH assay. Absorption peaks were observed at 2918.26 cm-1, 2850.24 cm-1, 1725.98 cm-1, 1517.08 cm-1, 1164.61 cm-1 and 1032.79 cm-1, representing carboxylic, aliphatic, carbonyl, ester, phenolic, tertiary alcohol and amino acid functional groups. This is evidenced by the presence of tannins, flavonoids, alkaloids, cyanogenic glycosides and anthraquinones. These functional groups are responsible for the therapeutic potential of the fruit tree and its use in the treatment of acute and chronic infections and diseases.
DOI: https://doi.org/10.5281/zenodo.16918227
Cardiogenic Disease Using ML
Authors: Asst. Prof. Srinivas V, Chethan Kumar B
Abstract: Cardiogenic shock (CS) represents one of the most critical and life-threatening complications of cardiovascular disease, arising when the heart is unable to circulate sufficient blood to meet the body’s metabolic needs. It frequently develops as a consequence of acute myocardial infarction, acute decompensated heart failure, or advanced cardiomyopathy. Even with progress in modern critical care, CS remains linked to exceptionally high mortality rates—often surpassing 40–50% in cases related to acute coronary syndromes. The sudden onset, rapid physiological decline, and diverse clinical presentations make timely recognition and accurate risk assessment particularly challenging. Conventional diagnostic tools, though indispensable, often lack the precision and speed needed to initiate intervention before irreversible damage occurs. In recent years, the adoption of Machine Learning (ML) techniques in cardiology has emerged as a promising avenue to address these limitations. ML can process extensive datasets from electronic health records (EHR), continuous monitoring systems, and imaging modalities, uncovering patterns that may be imperceptible to human observation. By analyzing structured and unstructured information—such as laboratory results, hemodynamic parameters, ECG data, and clinician notes—ML models can detect early warning signals, classify patient subgroups, and forecast outcomes with notable accuracy. Studies have demonstrated the value of predictive algorithms, such as gradient boosting methods like XGBoost, trained on multi-year de-identified hospital datasets. These models have been able to anticipate CS onset several hours before formal diagnosis, achieving area-under-the-curve (AUC) scores near 0.90. They can notify healthcare providers while patients are still in the emergency department, intensive care unit, or general ward, enabling earlier interventions.
Deep learning approaches—including convolutional and recurrent neural networks—have powered systems like “CShock,” which processes real-time patient data. Such systems have consistently outperformed traditional scoring tools, including the CardShock score, in predicting both occurrence and severity. Unsupervised learning techniques, such as clustering, have also been applied to categorize CS patients into distinct phenotypes based on physiological and biochemical profiles. This patient segmentation is essential, as it highlights variations in treatment responses and supports the development of personalized therapeutic strategies. Beyond detection, ML is being utilized for prognostication, including mortality and hospital readmission risk. National registry–based models have proven effective at predicting 7-day and 30-day readmissions, aiding in post-discharge planning and reducing the likelihood of recurrent hospitalization. Integrating time-series data from invasive arterial lines or wearable cardiac monitors further refines predictive capabilities by tracking evolving patient trends rather than relying solely on isolated measurements. The integration of Explainable AI (XAI) techniques—such as SHAP (Shapley Additive Explanations)—allows clinicians to identify which clinical features, like reduced systolic blood pressure, elevated lactate, or abnormal troponin levels, most strongly drive predictions, fostering greater trust in ML recommendations. Nonetheless, widespread clinical adoption faces obstacles. Data variability across institutions, stemming from differences in demographics, documentation standards, and care protocols, can limit model generalizability, underscoring the need for robust external validation. Moreover, many deep learning systems remain opaque “black boxes,” creating interpretability challenges in high-stakes decision-making. Ethical considerations—including data privacy, bias mitigation, and transparency—are equally critical. 
Future research will likely focus on multimodal ML systems that merge EHR data with imaging (e.g., echocardiography, cardiac MRI), genomic profiles, and continuous physiologic monitoring for a more comprehensive patient assessment. Adaptive learning models, which evolve alongside changes in treatment practices, could maintain accuracy over time. Implementation science will be crucial in integrating these tools into routine care without disrupting established workflows. Collaborative efforts among clinicians, data scientists, engineers, and industry partners will be needed to develop intuitive interfaces and ensure predictive insights are actionable at the bedside. Incorporating ML-driven decision support into telehealth and remote monitoring could further expand access to timely interventions in underserved regions. In summary, Machine Learning offers transformative potential for the early detection, classification, and management of cardiogenic shock. By enabling earlier action, supporting personalized care, and enhancing post-discharge outcomes, ML can markedly improve survival and recovery in this vulnerable population. However, its success will depend on rigorous validation, improved interpretability, strong ethical safeguards, and smooth integration into everyday clinical practice. As healthcare datasets grow in size and diversity, and computational capabilities advance, ML’s role in confronting the urgent challenges of cardiogenic shock is set to become increasingly vital.
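The AUC figure cited above (near 0.90) is the probability that a randomly chosen patient who developed CS is ranked above a randomly chosen patient who did not. A minimal, exact computation of this rank statistic (illustrative; the labels and scores below are made up, not from the cited studies):

```python
def auc(labels, scores):
    """Area under the ROC curve as the probability a random positive
    outranks a random negative; ties count as 0.5."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

This pairwise definition makes clear why AUC is insensitive to the classification threshold, which matters when a model must alert clinicians hours before a formal diagnosis.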
Structural Performance Of Tall Buildings With Bracing And Infill Walls Under Lateral Loads: A Review
Authors: Rahul Kumar Satbhaiya, Jitendra
Abstract: High-rise buildings are particularly susceptible to lateral forces induced by seismic and wind loads, which often govern their overall performance and safety. Conventional reinforced concrete (RC) frames, although effective in carrying vertical loads, lack sufficient stiffness to resist such lateral actions, making them prone to excessive displacement, inter-story drift, and even structural instability. To address these challenges, lateral load resisting elements such as masonry infill walls and steel bracing systems are increasingly incorporated into RC frames to enhance seismic resistance. This paper presents a comparative study on the effectiveness of different lateral load resisting systems in improving the seismic performance of high-rise buildings. The investigation considers three structural configurations: bare frame, masonry infilled frame, and externally braced frame with X-bracing. A 12-story reinforced concrete building model (R+12) with 3 m floor height is analyzed using CSI ETABS software under seismic Zone V conditions and response spectrum analysis for soft soil. Key performance parameters including base shear, lateral displacement, inter-story drift, and moment distribution are evaluated to assess the influence of each system on overall seismic behavior. The results reveal that the inclusion of masonry infill significantly reduces the demands on beams and columns by increasing lateral stiffness and redistributing forces. However, the most notable improvement is observed with the use of external steel bracing, particularly X-braces, which provide superior stability and effectively minimize displacements. Steel-braced frames also demonstrate greater cost-efficiency compared to masonry infill, while ensuring higher ductility and energy dissipation. In comparison, bare frames exhibit the least stability and maximum lateral displacements. 
Overall, the study confirms that lateral load resisting elements play a crucial role in enhancing the seismic performance of RC frames, with steel bracing emerging as the most effective solution, followed by masonry infill.
DOI: https://doi.org/10.5281/zenodo.16925530
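One of the performance parameters evaluated above, inter-story drift, is simply the difference between the lateral displacements of consecutive floors divided by the storey height. A small sketch of that calculation (the displacement values are illustrative, not results from the paper):

```python
def interstorey_drift_ratios(displacements, storey_height=3.0):
    """Inter-storey drift ratio per storey.

    displacements: lateral floor displacements (metres), listed bottom-up;
    the ground level is taken as zero. storey_height matches the 3 m floor
    height used in the study's model."""
    d = [0.0] + list(displacements)
    return [(d[i] - d[i - 1]) / storey_height for i in range(1, len(d))]
```

Design codes limit the maximum of these ratios, which is why bracing systems that reduce relative floor displacements dominate the comparison.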
Optimizing Evaluation Processes With Comprehensive Metrics Like PSNR, AMBE, And F1 Score For Consistent Document Enhancement And Classification Performance
Authors: Santhosh SG, Sampath Kumar
Abstract: As digital data grows rapidly, the number of image-based documents containing usable text continues to rise, yet extracting that text is often complicated by distortion, unusual font types, misaligned printed text, random text orientation, and other forms of noise. Bilingual image-based documents, such as government forms, educational transcripts, medical records, and business receipts that mix multiple languages within a single document, add further layers of complexity to these challenges. This paper examines these issues through three complementary evaluation metrics: the F1 score measures consistent and reliable classification accuracy; AMBE quantifies interference with the brightness distribution; and PSNR assigns a clarity score to an image. Combined, the three metrics present a framework to enhance the reliability and consistency of document processing systems.
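The two image-quality metrics named in the abstract have standard closed forms: AMBE is the absolute difference of mean brightness between the original and enhanced image, and PSNR is 10·log10(MAX²/MSE). A minimal sketch over flattened pixel lists (illustrative helper names, not the authors' code):

```python
import math

def ambe(original, enhanced):
    """Absolute Mean Brightness Error: |mean(original) - mean(enhanced)|.
    Lower is better; 0 means enhancement preserved mean brightness."""
    mean = lambda img: sum(img) / len(img)
    return abs(mean(original) - mean(enhanced))

def psnr(original, enhanced, max_val=255.0):
    """Peak Signal-to-Noise Ratio in dB over flattened pixel intensities."""
    mse = sum((a - b) ** 2 for a, b in zip(original, enhanced)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * math.log10(max_val ** 2 / mse)
```

Reporting both together is the point of the paper's framework: PSNR can stay high while enhancement shifts overall brightness, which AMBE catches.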
A Smart PDF Query System For Efficient And Scalable Information Retrieval Using GenAI And Vector Databases
Authors: Vengadeshwaran B
Abstract: Conventional keyword-based search systems lack contextual understanding and often return irrelevant or incomplete results. In enterprise environments, this becomes a bottleneck when users attempt to extract precise information from large and complex documents. This paper introduces AskMyDoc, a scalable project management tool and smart document querying system that leverages semantic search, large language models (LLMs), and vector databases. Unlike traditional SQL databases susceptible to human errors during updates, AskMyDoc processes and indexes documents using embedding techniques and retrieves answers using generative AI, thereby ensuring accuracy, speed, and consistency. The system is built with LangChain, FAISS, Sentence Transformers, and OpenAI’s GPT models, supporting real-time natural language querying even across gigabyte-scale documents.
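The retrieval step described above, embedding document chunks and ranking them by similarity to the query embedding, can be sketched without any vector-database dependency; the toy index and 2-dimensional "embeddings" below are illustrative stand-ins for FAISS and Sentence Transformer vectors.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def top_k(query_vec, index, k=2):
    """index: list of (chunk_text, embedding) pairs; return the k chunks
    whose embeddings are nearest the query by cosine similarity."""
    ranked = sorted(index, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]
```

In the full system the retrieved chunks are then passed to the LLM as context for answer generation; FAISS replaces the linear scan here with an approximate nearest-neighbour index so the same ranking scales to gigabyte-sized corpora.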
Theoretical Investigation of Deformed Nuclei: Impacts on Nuclear Stability and Excitation Phenomena
Authors: Suresh Kumar, Dr. Vandana
Abstract: Deformed nuclei, characterized by deviations from spherical symmetry, exhibit unique structural properties that are critical to understanding nuclear stability, reaction dynamics, and excitation phenomena. This theoretical study investigates the structural properties of selected deformed nuclei using advanced nuclear models and computational approaches. Employing Density Functional Theory (DFT) with Skyrme and Gogny interactions, alongside Hartree-Fock-Bogoliubov (HFB) calculations, the research analyses deformation effects on nuclear binding energy, charge distributions, and level densities. Transitional and neutron-rich nuclei are emphasized to explore the evolution of deformation, triaxiality, and nuclear softness. The results reveal significant impacts of deformation on nuclear moment of inertia and energy spectra, particularly in rare-earth and actinide regions. The inclusion of triaxiality further enhances the accuracy of predictions for level densities and excitation spectra. Comparisons with experimental data from gamma-ray spectroscopy and Coulomb excitation validate the robustness of the theoretical frameworks employed. This study addresses key gaps in understanding nuclear deformation, particularly for isotopic chains near the neutron drip line and transitional regions. The findings provide critical insights for refining existing nuclear models and guiding future experimental investigations. Furthermore, this research highlights the importance of incorporating pairing correlations and deformation effects to predict properties of nuclei far from stability. The study contributes to the broader understanding of nuclear structure and its applications in nuclear energy, astrophysics, and particle physics. These findings underscore the role of theoretical models in complementing experimental efforts and advancing nuclear physics research.
DOI: http://doi.org/10.5281/zenodo.16940412
Impact Of Nuclear Deformation On Structural Parameters, Energy Levels, And Quadrupole Moments Of 152Sm, 238U, And 240Pu Nuclei
Authors: Suresh Kumar, Dr. Vandana, Shashikant Sheoran, Vandana Mahlawat
Abstract: This study explores the structural characteristics of selected deformed nuclei using theoretical frameworks, focusing on their shapes, energy levels, and quadrupole moments. Nuclear deformation results from the complex interplay between shell effects and the strong nuclear force, causing deviations from spherical symmetry. Advanced models, including the Nilsson model, Hartree-Fock-Bogoliubov (HFB) theory, and collective models, are employed to examine the influence of deformation on nuclear structure. The key findings emphasize the role of deformation in shaping rotational spectra and intrinsic quadrupole moments, with significant results for nuclei such as 152Sm, 238U, and 240Pu. Calculations of quadrupole deformation parameters and energy levels show strong agreement with experimental data, validating the theoretical approaches. The study also investigates the stabilization of heavy nuclei through deformation, which redistributes charge density and mitigates Coulomb repulsion. These findings have important applications in nuclear astrophysics, particularly in the rapid neutron capture process (r-process), as well as in nuclear technology, where insights into deformed nuclei contribute to isotope development and reactor design. A deeper understanding of deformed nuclei in this research advances both fundamental nuclear physics and practical applications.
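Two standard collective-model relations underlie the quantities computed in such studies; they are textbook formulas given here for orientation, not equations reproduced from the paper:

```latex
% Rigid-rotor excitation energies of a deformed even-even nucleus,
% with \mathcal{I} the moment of inertia:
E(I) = \frac{\hbar^{2}}{2\mathcal{I}}\, I(I+1), \qquad I = 0, 2, 4, \dots

% First-order relation between the intrinsic quadrupole moment and the
% quadrupole deformation parameter \beta_2, with R_0 \approx 1.2\,A^{1/3}\ \mathrm{fm}:
Q_0 \simeq \frac{3}{\sqrt{5\pi}}\, Z R_0^{2}\, \beta_2
```

Fitting measured rotational spectra to the first relation and measured quadrupole moments to the second is how deformation parameters such as those for 152Sm, 238U, and 240Pu are extracted and compared with theory.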
EXPLORING INNOVATIVE METHODS AND ALGORITHMS TO ACHIEVE GRACEFUL LABELLING FOR DIFFERENT CLASSES OF TREES
Authors: Noor Jahan Fatima, Dr. Sarabjit Kaur
Structural Performance Of Tall Buildings With Bracing And Infill Walls Under Lateral Loads
Authors: Rahul Kumar Satbhaiya, Jitendra
Abstract: High-rise buildings are particularly susceptible to lateral forces in seismically active regions. The primary consideration in their design is ensuring adequate resistance to these lateral stresses, as insufficient stability may lead to excessive displacement, structural instability, or even collapse. To mitigate such risks, buildings must be designed with effective lateral load–resisting mechanisms that enhance overall stability and serviceability. Among the commonly employed methods, steel bracing and masonry infill within reinforced concrete (RC) frames are recognized for their efficiency in resisting lateral loads. Steel bracing systems are advantageous due to their ease of installation, minimal space requirements, and ability to provide significant stiffness and strength with considerable design flexibility. Similarly, masonry infill can be executed efficiently with skilled labor and contributes to the overall lateral resistance of the structure. This study investigates the seismic performance of a reinforced concrete high-rise building of configuration R+12 (13 stories in total). Three structural configurations are evaluated: (i) bare frame, (ii) infilled frame with solid masonry, and (iii) frame with X-braced corner supports. The building model was developed and analyzed using CSI ETABS software, considering a three-dimensional asymmetric layout with a floor height of 3 m. Dynamic analysis was carried out using the response spectrum method for seismic Zone V under soft soil conditions, as specified by Indian seismic design guidelines. The results demonstrate that external steel bracing provides superior stability and reduced displacement compared to masonry infill and bare frames. In terms of both resistance and moment capacity, the steel bracing system proved to be the most effective lateral load–resisting system.
Furthermore, from a cost-performance perspective, the steel bracing system was found to be the most economical, followed by solid masonry infill, while the bare frame exhibited the least efficiency.
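One criterion behind such response-spectrum checks is the inter-storey drift limit. As a minimal sketch only, assuming the IS 1893 drift limit of 0.004 times the storey height (the displacement values below are hypothetical, not taken from the paper's ETABS model):

```python
def storey_drift_check(displacements_mm, storey_height_mm=3000.0,
                       limit_ratio=0.004):
    """Inter-storey drift check against a drift limit of limit_ratio * h.

    `displacements_mm` lists the lateral displacement at each floor,
    bottom-up. Returns the per-storey drifts and whether all of them
    satisfy the limit (12 mm for a 3 m storey at the 0.004 ratio).
    """
    limit = limit_ratio * storey_height_mm
    drifts = [abs(upper - lower)
              for lower, upper in zip(displacements_mm, displacements_mm[1:])]
    return drifts, all(d <= limit for d in drifts)

# Hypothetical floor displacements (mm) from a response-spectrum run:
drifts, ok = storey_drift_check([0.0, 8.0, 15.0, 21.0])
```

Here every storey drift (8, 7, 6 mm) stays under the 12 mm limit, so the configuration would pass this particular serviceability check.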
Use Of AI Tools To Enhance Workplace Productivity
Authors: Dr. Shivani Budhkar, Chavan Krushna Rameshwar
Abstract: This paper investigates the role of Artificial Intelligence (AI) tools in modern workplaces, focusing on their potential to boost efficiency and overall productivity. AI-driven technologies provide organizations with advanced capabilities to streamline workflows, improve decision-making, and foster innovation. Drawing on existing research, industry reports, and case studies, the study highlights both the opportunities and challenges involved in adopting AI across different workplace settings. By analyzing real-world implementations and practical applications, this research offers actionable insights for organizations aiming to leverage AI as a means of achieving operational excellence and long-term strategic goals in the digital era.
Convolutional Neural Networks For Fault Detection In Software-Defined Vehicles
Authors: Sushil Panda
Abstract: The proliferation of software-defined vehicles (SDVs) has necessitated the development of sophisticated fault detection mechanisms capable of processing high-dimensional, multimodal sensor data in real-time. This paper presents a comprehensive analysis of Convolutional Neural Network (CNN) architectures for fault detection in SDVs, examining their theoretical foundations, implementation strategies, and performance characteristics. Through extensive experimentation and comparative analysis, we demonstrate that CNN-based approaches achieve superior performance compared to traditional rule-based and statistical methods, with accuracy improvements of 15-25% and false positive rates reduced by up to 40%. Our technical contribution includes a novel ensemble architecture combining 1D-CNNs with attention mechanisms for temporal sensor data analysis, achieving 94.7% accuracy in fault classification. The paper provides detailed mathematical formulations, algorithmic implementations, and empirical validation across multiple vehicle subsystems, establishing CNNs as the state-of-the-art solution for fault detection in modern automotive systems.
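The paper's 1D-CNN ensemble itself is not reproduced here; as an illustrative sketch only, the core operation, a 1D convolution sliding a kernel over a sensor window, can be written in plain Python (the kernel weights and the `window` values below are hypothetical, not trained parameters):

```python
def conv1d(signal, kernel, stride=1):
    """Valid-mode 1D cross-correlation: slide the kernel over the signal."""
    k = len(kernel)
    out = []
    for start in range(0, len(signal) - k + 1, stride):
        out.append(sum(signal[start + i] * kernel[i] for i in range(k)))
    return out

def relu(xs):
    """Rectified linear activation applied element-wise."""
    return [max(0.0, x) for x in xs]

# Hypothetical sensor window with a sudden jump (a candidate fault),
# filtered with a simple edge-detecting kernel:
window = [0.0, 0.1, 0.0, 0.1, 5.0, 5.1, 5.0]
feature_map = relu(conv1d(window, [-1.0, 0.0, 1.0]))
# Large activations mark the transient where the signal jumps.
```

In a real 1D-CNN the kernel weights are learned from labeled fault data and many such feature maps feed into further layers; this sketch only shows the sliding-window arithmetic.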
CAN Bus Data Prediction Using Temporal Neural Networks In Software-Defined Vehicles
Authors: Sushil Panda
Abstract: The CAN bus in software-defined vehicles is vital to enhancing the vehicle’s performance, safety, and cybersecurity. The CAN bus is the digital nervous system of modern cars, handling the communication stream between all the Electronic Control Units (ECUs) of a software-defined vehicle. This research provides a comprehensive, security-aware framework for CAN bus data prediction in SDVs using advanced temporal neural networks. The paper proposes a hybrid architecture that combines a Transformer-based attention mechanism with Graph Neural Networks (GNNs) to capture both the temporal dependencies and the network-topology patterns in bus communications. The approach addresses the challenges associated with high-frequency, complex time-series data while ensuring compliance with ISO/SAE 21434 cybersecurity standards and ISO 26262 functional safety requirements, achieved through privacy-preserving training across multiple vehicles combined with real-time intrusion-detection capabilities. The hybrid Transformer-GNN architecture is implemented in Python and evaluated on synthetic data generated with random functions. The results demonstrate a significant improvement in prediction accuracy (96.3%), cybersecurity threat detection (98.1% precision), and energy efficiency (34% reduction in computational overhead). The proposed framework achieved ASIL-C compliance and reduced false alarm rates by 31% compared to existing methods while maintaining sub-millisecond inference latency suitable for safety-critical automotive applications.
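The Transformer-GNN predictor is far beyond a short sketch; as a deliberately simplified stand-in for the same predict-and-threshold idea, an exponentially weighted moving average can serve as the predictor (all signal values below are made up):

```python
def ewma_anomalies(samples, alpha=0.3, threshold=2.0):
    """Flag samples whose deviation from an EWMA prediction exceeds `threshold`.

    The exponentially weighted moving average is a deliberately simple
    stand-in for the paper's temporal neural predictor; flagged samples
    would be candidate intrusions or faults. Anomalous samples do not
    update the prediction, so a single spike does not poison the baseline.
    """
    pred = samples[0]
    flags = []
    for x in samples:
        is_anomaly = abs(x - pred) > threshold
        flags.append(is_anomaly)
        if not is_anomaly:
            pred = alpha * x + (1 - alpha) * pred
    return flags

# Hypothetical engine-speed readings with one injected spike at index 4:
readings = [100, 101, 99, 100, 150, 100, 101]
flags = ewma_anomalies(readings)  # only the spike is flagged
```

A learned temporal model replaces the EWMA in practice, but the surrounding logic (predict, compare, flag) is the same.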
Application Of Neural Networks In Infotainment Systems Of Modern Vehicles
Authors: Sushil Panda
Abstract: Autonomous vehicles and electric vehicles are the new definition of the modern vehicle. Centralised control over the vehicle and data-driven decision-making are the defining features of these systems. For an automaker, the user experience, in which vehicle and software combine, is now a principal selling point. Neural networks play a crucial role in the backend, providing real-time updates to the user about the vehicle’s status. The current study analyses the importance of the user interface and experience, as well as the characteristic features of an infotainment system. This paper also presents a classic scenario of neural networks in State of Charge (SoC) monitoring for an electric vehicle, integrating real-time result updates with the user interface and advising the driver through the dashboard, thereby enhancing data-driven communication and decision-making through the vehicle’s infotainment system.
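The paper's SoC network is not published; as a purely hypothetical sketch, the forward pass of a tiny fully connected network mapping (voltage, current) features to an SoC estimate can be written with the standard library alone. All weights and biases below are made-up placeholders, not trained values:

```python
import math

def dense(inputs, weights, biases, activation):
    """One fully connected layer: weighted sum plus bias, then activation."""
    return [activation(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def soc_estimate(voltage, current):
    """Two-layer perceptron; the sigmoid output keeps SoC in [0, 1]."""
    hidden = dense([voltage, current],
                   [[0.8, -0.2], [0.5, 0.1]],   # placeholder weights
                   [0.0, 0.1], math.tanh)
    (soc,) = dense(hidden, [[1.2, 0.6]], [-0.5], sigmoid)
    return soc

soc = soc_estimate(voltage=3.7, current=-1.2)  # discharging at 1.2 A
# The dashboard would display soc * 100 as a percentage.
```

A deployed estimator would be trained on battery telemetry and use richer features (temperature, history); the sketch only shows how a forward pass turns sensor readings into a bounded SoC figure.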
DESIGN AND OPTIMIZATION OF DVFS-BASED VLSI CONTROLLERS FOR REAL-TIME VPP ENERGY MANAGEMENT
Authors: Anand Kumar Yadav
Abstract: Against the backdrop of a severe energy crisis and growing awareness of the need for environmental protection, the efficient utilization of renewable energy has become a topic of intense interest. The virtual power plant (VPP) is an effective means of coordinating distributed energy systems (DES), deploying them for power-grid dispatching or power trading. In this paper, the operating modes of a VPP with penetration of wind power, solar power, and energy storage are investigated. First, the grid-connection requirements of the VPP are derived from current wind and solar photovoltaic (PV) grid-connection requirements, and its efficiency is analyzed. Second, by applying a self-organizing map (SOM) clustering algorithm to the VPP's output data under several typical scenarios, a profit-optimization model is developed as a guideline for the VPP's optimal operation. Based on this model, case studies are performed, and the results show that the model is both practical and effective.
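The paper's SOM configuration and VPP dataset are not given; as an illustrative sketch only, a minimal one-dimensional self-organizing map can cluster scalar output samples (the MW figures below are fabricated):

```python
import random

def train_som(data, n_nodes=3, epochs=50, lr=0.5, seed=1):
    """Minimal 1-D self-organizing map over scalar samples.

    Each node holds a prototype value; the best-matching node (and, during
    the early ordering phase, its immediate neighbours) is pulled toward
    every sample with a decaying learning rate.
    """
    rng = random.Random(seed)
    nodes = [rng.uniform(min(data), max(data)) for _ in range(n_nodes)]
    for epoch in range(epochs):
        rate = lr * (1 - epoch / epochs)           # decaying learning rate
        radius = 1 if epoch < epochs // 2 else 0   # shrinking neighbourhood
        for x in data:
            bmu = min(range(n_nodes), key=lambda i: abs(nodes[i] - x))
            for i in range(n_nodes):
                if abs(i - bmu) <= radius:
                    nodes[i] += rate * (x - nodes[i])
    return nodes

# Hypothetical daily VPP output levels (MW) spanning three operating regimes:
output = [2.1, 1.9, 2.0, 8.2, 7.9, 8.1, 15.0, 14.8, 15.2]
prototypes = sorted(train_som(output))
# With well-separated regimes, prototypes tend toward the cluster centres.
```

The actual study would use multi-dimensional output profiles and a 2-D node grid; the sketch only shows the best-matching-unit update that defines SOM training.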
A Study of IoT Ecosystem & Role of Cloud Computing to Optimize IoT Performance
Authors: Dr. Jyoti, Associate Professor, Ms. Jyoti
Abstract: The rapid expansion of the Internet of Things (IoT) has led to a massive increase in the number of connected devices, generating large volumes of heterogeneous data. Managing this data and ensuring efficient device performance requires advanced computing infrastructures. Cloud computing offers a scalable and cost-effective platform to support IoT ecosystems by providing storage, processing, and analytics capabilities on demand. This study explores the integration of IoT ecosystems with cloud computing to optimize IoT performance. It examines IoT architecture, communication protocols, and data management strategies while highlighting the role of cloud-based services in reducing latency, improving scalability, and enhancing security. The research also emphasizes performance optimization through edge computing, load balancing, and intelligent resource allocation. The findings suggest that a well-structured IoT-cloud integration can significantly improve system efficiency, reduce operational costs, and enable real-time decision-making, paving the way for smarter and more sustainable IoT deployments. This paper lays the foundation for this research by introducing the concepts of cloud computing, IoT ecosystems, and deep learning, while highlighting their interdependencies and potential for performance optimization. The paper also outlines the motivation behind this study, identifies key challenges, and presents the significance and contributions of the research.
Reengineering Workforce Agility By Leveraging Core HCM Compensation And Performance Modules In Workday Ecosystems
Authors: Harish Govinda Gowda
Abstract: Containers have become the backbone of modern enterprise IT, providing portability, agility, and consistency across environments. However, scaling containers across hybrid and multi-cloud infrastructures requires more than orchestration—it demands governance, security, and resilience. This article explores how Kubernetes, Helm, and OpenShift can be harmonized to achieve container intelligence at scale. Kubernetes provides orchestration, Helm simplifies application deployment and lifecycle management, and OpenShift delivers governance, compliance, and enterprise-grade security. By layering these tools together, organizations can create resilient, scalable ecosystems that balance agility with trust. The discussion highlights key challenges in scaling containers, the role of each tool, and best practices for enterprise adoption, emphasizing that true resilience comes from harmonizing orchestration, management, and governance into one cohesive framework.
MODIFIED RESNET-50 ARCHITECTURE FOR SCOLIOSIS DETECTION
Authors: Ronnel C. Mesia, Dr. John Lenon E. Agatep
Abstract: Scoliosis is a condition where the spine curves abnormally, which can cause discomfort, pain, and difficulties with movement. It is essential to detect and diagnose scoliosis as early as possible to prevent further complications and improve treatment outcomes (Brackett, 2023). The main goal of this study was to improve the classification accuracy of the ResNet-50 architecture in detecting scoliosis on unclothed human back images, enabling early detection and intervention to prevent progression of the spinal curvature. The modified ResNet-50 architecture in this study incorporates global average pooling and reduces the size of the fully connected layers of the original ResNet-50 architecture. The dataset used in this study consists of unclothed human back images of normal subjects and of subjects with scoliosis. The dataset was sourced from public repositories, private individuals, and patients at President Ramon Magsaysay Memorial Hospital, Iba, Zambales. These images were annotated and validated by medical experts from PRMMH. The Modified ResNet-50 model showed outstanding performance with slight fluctuation in validation loss, similar to the findings of Artates et al. (2024) that despite minimal validation-loss fluctuations a model can still be robust and reliable. The Modified ResNet-50 model achieved impressive results and outperformed the baseline ResNet-50 across multiple evaluation metrics, reaching an accuracy of ninety-seven percent (97%), precision and recall values of ninety-six point five percent (96.5%), and an F1-Score, Macro & Weighted Average of ninety-seven percent (97%). These results indicate that the model is highly effective in accurately classifying unclothed human back images.
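The modification replaces large fully connected layers with global average pooling. What that pooling step computes can be shown in plain Python (the 2-channel feature map below is a toy example, not network output):

```python
def global_average_pool(feature_maps):
    """Collapse each H x W channel to its mean, giving one value per channel.

    This is the pooling step a modified ResNet-50 can use in place of
    flattening feature maps into large fully connected layers, which
    sharply reduces the parameter count of the classifier head.
    """
    pooled = []
    for channel in feature_maps:            # each channel is an H x W grid
        total = sum(sum(row) for row in channel)
        count = sum(len(row) for row in channel)
        pooled.append(total / count)
    return pooled

# Hypothetical 2-channel, 2x2 feature map:
fmap = [[[1.0, 3.0], [5.0, 7.0]],
        [[0.0, 2.0], [4.0, 6.0]]]
pooled = global_average_pool(fmap)  # one mean per channel
```

In the real architecture this runs on 2048 channels of 7x7 activations, so the classifier sees a 2048-vector instead of a 100k-element flattened tensor.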
High-Performance RF Mixers: A Comparative Study And Design Framework
Authors: Lalita Chouhan, Mr. Divyanshu Wagh
Abstract: Recently, the use of RF mixers has grown substantially. The Gilbert cell remains the de facto core thanks to its high conversion gain, strong port-to-port isolation, and suppression of even-order distortion. Multi-tanh linearization implemented with multiple parallel differential transconductance stages offers excellent linearity but typically yields very low conversion gain. In contrast, current-bleeding improves both linearity and conversion gain by injecting additional bias current, at the cost of increased power consumption. Leveraging CMOS for its low cost, low static power, and compact area, we designed single-ended and differential low-noise amplifiers (LNAs) for WCDMA reception using the BSIM3v2 (Level-49) UMC 0.18-µm process in the Xcircuit open-source EDA tool. The designs prioritize high gain, low noise, good linearity, and input matching to a 50-ohm RF system.
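For a rough sense of the Gilbert cell's conversion gain mentioned above, the textbook first-order expression with ideal square-wave LO switching is Av = (2/π)·gm·RL. A minimal sketch (the gm and RL values are hypothetical operating-point numbers, not from the paper):

```python
import math

def gilbert_conversion_gain_db(gm_siemens, r_load_ohm):
    """First-order Gilbert-cell conversion gain with ideal square-wave
    LO switching: Av = (2 / pi) * gm * RL, expressed in dB."""
    av = (2 / math.pi) * gm_siemens * r_load_ohm
    return 20 * math.log10(av)

# Hypothetical operating point: gm = 5 mS, RL = 1 kOhm -> about 10 dB.
gain_db = gilbert_conversion_gain_db(5e-3, 1000)
```

Real designs see lower gain from finite switching time, parasitics, and load effects, which is why current-bleeding (boosting the effective gm budget) is attractive.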
An Introduction To Hand Gesture Controlled System Using Python
Authors: Minal Dhankar
Abstract: Hand gestures are a natural way to interact with computers, and with the advancement of computer vision and machine learning, this technology has become accessible for various applications. This paper explores the development and implementation of a hand gesture-controlled system using Python. The system discussed in this paper uses Python libraries such as OpenCV and Mediapipe to capture, recognize, and interpret hand gestures in real time, enabling user interaction with devices such as computers, robots, and smart systems without physical contact.
DOI: https://doi.org/10.5281/zenodo.17005234
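Systems like the one described typically map Mediapipe's 21 hand landmarks to gestures with simple geometric rules. As an illustrative sketch (the tip/PIP indices follow Mediapipe's published hand-landmark convention, and the coordinates below are fabricated, not captured frames):

```python
# Mediapipe-style hand landmarks: 21 (x, y) points in image coordinates,
# where y grows downward. Indices follow the Mediapipe Hands convention.
FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
FINGER_PIPS = [6, 10, 14, 18]   # the PIP joint below each fingertip

def count_raised_fingers(landmarks):
    """A finger counts as raised when its tip sits above its PIP joint.

    The thumb is omitted: it needs an x-axis comparison that depends on
    handedness, so this sketch sticks to the four straight fingers.
    """
    return sum(1 for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
               if landmarks[tip][1] < landmarks[pip][1])

# Hypothetical landmark set: index finger extended, the others curled.
points = [(0.5, 0.9)] * 21
points[8], points[6] = (0.5, 0.2), (0.5, 0.5)   # index tip above its joint
```

In the real pipeline, OpenCV supplies camera frames and Mediapipe produces the landmark list; the rule above then turns finger counts into commands.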
Profiling Bacterial Community Structure In Banana (Musa Sapientum L.) Fruits: Insights From Illumina Next-Generation Sequencing
Authors: N. G. Ogbuji, N. B. Wofu
Abstract: Banana (Musa sapientum L.), a rich source of essential nutrients, is a staple food crop in many tropical regions. In spite of its economic viability, its production faces numerous challenges, including diseases and pests, which can significantly impact yields and quality. This study aimed to determine the bacterial organisms and the bacterial community structure associated with banana fruits using the Illumina next-generation sequencing (NGS) platform. Ripe and unripe banana fruits were sourced from Choba market in Obio-Akpor Local Government Area, Rivers State, Nigeria. Deoxyribonucleic acid (DNA) extraction was done using Laragen’s validated proprietary bacterial DNA extraction protocol. Polymerase chain reaction (PCR) amplification was performed to target the V4 region of the 16S rDNA gene using the conserved primers 515F and 806R. The most abundant phyla obtained from ripe and unripe banana were Proteobacteria (41.7%), Firmicutes (25.7%), Bacteroidetes (9.7%) and Actinobacteria (2.5%). The abundance of the predominant genus (Pseudomonas) was 41.7%. Other genera with relatively high abundance include Bacteroides (25.94%) and Ruminococcus (9.94%). The results from this study show that banana fruits are associated with beneficial and pathogenic organisms of human interest. Beneficial microorganisms such as Trabulsiella and Bacillus can be harnessed to promote plant growth and confer anti-fungal resistance in plants through biotechnology.
DOI: https://doi.org/10.5281/zenodo.17019678
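The percentages reported in such community profiles come from relative-abundance calculations on the read counts, often alongside a diversity index. A minimal sketch with hypothetical genus-level counts (chosen only to echo the magnitudes above, not the study's raw data):

```python
import math

def relative_abundance(counts):
    """Convert raw read counts per taxon to percentages of the total."""
    total = sum(counts.values())
    return {taxon: 100.0 * n / total for taxon, n in counts.items()}

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over taxa with reads."""
    total = sum(counts.values())
    return -sum((n / total) * math.log(n / total)
                for n in counts.values() if n > 0)

# Hypothetical genus-level read counts (not the paper's raw data):
reads = {"Pseudomonas": 417, "Bacteroides": 259,
         "Ruminococcus": 99, "Other": 225}
abundance = relative_abundance(reads)   # Pseudomonas -> 41.7%
diversity = shannon_index(reads)        # higher = more even community
```

Pipelines such as QIIME 2 compute these (and rarefied variants) automatically; the sketch shows only the arithmetic behind the reported percentages.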
Analyzing The Drivers And Constraints Of Air Conditioning Equipment Market Penetration In Nigeria: Implications For Sustainable Growth And Energy Policy
Authors: Ayodele Abeshin
Abstract: This paper presents a comprehensive analysis of the drivers and constraints influencing the market penetration of air conditioning equipment in Nigeria, with specific implications for sustainable growth and energy policy. Situated within a tropical climate and experiencing rapid urbanization and a growing middle class, Nigeria presents a significant, yet underexploited, market for cooling technologies. The methodology employed a quantitative survey design, administering a structured questionnaire featuring a five-point Likert scale to a purposive sample of 50 stakeholders including manufacturers, distributors and end-users. Data were analyzed using descriptive statistics via SPSS version 28 to quantify prevalent perceptions and trends. The findings clearly delineate a market caught between potent drivers and severe constraints. Key drivers identified include rising urbanization and housing development, a growing middle class, and awareness of health and comfort benefits. Government programs promoting energy efficiency were also viewed favorably, though their impact was noted to be limited by enforcement. Conversely, the analysis reveals formidable barriers to widespread penetration. The high upfront cost of equipment and installation emerged as the most significant constraint, severely limiting affordability. This is compounded by Nigeria’s notoriously unreliable and inadequate electricity supply, which increases operational costs and diminishes the perceived ease of use. Furthermore, a heavy dependence on imported units exposes the market to price volatility and foreign exchange fluctuations, while weak enforcement of existing energy policies and regulations fails to create a conducive environment for sustainable market growth. The penetration of air conditioning in Nigeria is a double-edged sword, representing both a marker of economic development and a potential threat to energy security and environmental sustainability.
The study concludes that achieving sustainable growth requires a multi-faceted approach that simultaneously leverages the identified drivers and systematically addresses the constraints. Recommendations are therefore geared towards integrated policy actions: encouraging local manufacturing to reduce costs and import dependency; implementing financial incentives for energy-efficient units; strengthening enforcement of energy efficiency standards; investing in grid stability and renewable energy integration and launching public awareness campaigns to educate consumers on sustainable cooling solutions. Ultimately, this research provides a critical evidence-based framework for policymakers and industry stakeholders to foster a sustainable, accessible and efficient air conditioning market that aligns with Nigeria’s broader economic development and climate goals.
A Review On Analysis Of Different Types Of Cavities For Solar Collector
Authors: Merajul Hasan, Dr. P.N Ahirwar
Abstract: The design and geometry of cavities play a crucial role in determining the thermal performance and overall efficiency of solar collectors. Different cavity configurations influence heat absorption, heat loss reduction, and fluid heat transfer characteristics. This review presents a comprehensive analysis of various types of cavities employed in solar collectors, including cylindrical, conical, spherical, hemispherical, triangular, and compound cavity designs. The study emphasizes how cavity shape influences thermal performance. Comparative findings from experimental and numerical investigations reported in the literature highlight that well-optimized cavity geometries can significantly enhance solar energy utilization by improving heat retention and minimizing losses. The review also identifies key parameters influencing cavity performance, such as insulation, material properties, and flow arrangements, and provides insights into the suitability of different cavity types for diverse climatic and operational conditions. Finally, the paper outlines potential research directions focusing on hybrid cavity designs, advanced coatings, and integration with modern energy systems, aiming to further improve the efficiency and sustainability of solar thermal technologies.
A Review On Analysis Of CO₂ Finned Tube Gas Coolers
Authors: Shubham Kumar, Dr. P.N Ahirwar
Abstract: The increasing demand for environmentally sustainable refrigeration and air-conditioning systems has accelerated research on carbon dioxide (CO₂) as a natural refrigerant, particularly in transcritical cycles. Gas coolers play a pivotal role in determining system efficiency, and finned tube configurations have emerged as a promising solution to enhance heat transfer performance while maintaining compactness. This review paper provides a comprehensive analysis of CO₂ finned tube gas coolers, highlighting their design considerations, thermal–hydraulic performance, and optimization techniques. Key factors such as fin geometry, tube arrangement, material selection, and flow distribution are critically examined in relation to heat transfer enhancement and pressure drop characteristics. Studies investigating numerical simulations, experimental measurements, and hybrid modeling approaches are summarized to identify performance trends and governing mechanisms. Furthermore, the influence of operating parameters such as mass flux, inlet temperature, and gas cooler pressure on system efficiency is discussed. The review also emphasizes advancements in surface modifications, louvered and wavy fin structures, and novel compact designs aimed at improving thermal performance. Current challenges such as high operating pressure, material durability, and cost-effectiveness are outlined, along with opportunities for integrating advanced manufacturing techniques and nanofluid applications. The findings of this review underline the importance of finned tube gas coolers in maximizing the potential of CO₂-based systems and provide insights for future research directions in sustainable refrigeration technologies.
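Gas-cooler duty estimates of the kind surveyed above often start from the log-mean temperature difference (LMTD), though for transcritical CO₂, whose specific heat varies strongly near the critical point, LMTD is only a coarse first approximation. A minimal sketch with hypothetical temperatures and UA value:

```python
import math

def lmtd_counterflow(t_hot_in, t_hot_out, t_cold_in, t_cold_out):
    """Log-mean temperature difference for a counterflow heat exchanger."""
    dt1 = t_hot_in - t_cold_out     # terminal difference, hot inlet end
    dt2 = t_hot_out - t_cold_in     # terminal difference, hot outlet end
    if abs(dt1 - dt2) < 1e-9:       # equal differences: LMTD -> dt1
        return dt1
    return (dt1 - dt2) / math.log(dt1 / dt2)

def duty_kw(ua_kw_per_k, lmtd):
    """Heat rejected: Q = U * A * LMTD."""
    return ua_kw_per_k * lmtd

# Hypothetical CO2 gas-cooler temperatures (degC) and overall UA value:
lmtd = lmtd_counterflow(t_hot_in=100.0, t_hot_out=40.0,
                        t_cold_in=30.0, t_cold_out=45.0)
q = duty_kw(ua_kw_per_k=0.8, lmtd=lmtd)   # about 21 kW for this sketch
```

Detailed studies in the review instead segment the cooler and use local CO₂ properties, precisely because a single LMTD misses the property variation.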
AI-Powered Soft Skills Training: Measuring The Efficacy Of VR Simulations For Cross-Cultural Competence In Global Tech Teams
Authors: Saravanan Balachandran
Abstract: In today’s interconnected and hybrid work environment, global tech teams frequently face miscommunication, collaboration inefficiencies, and cultural misunderstandings—issues rooted not in technical ability, but in the absence of robust Cross-Cultural Competence (CCC). As artificial intelligence (AI) and immersive technologies evolve, new possibilities have emerged for delivering soft skills training through Virtual Reality (VR)-driven experiential simulations. This study evaluates the effectiveness of AI-powered VR simulations in developing CCC among early-career professionals working in multinational technology firms. Grounded in Kolb’s Experiential Learning Theory (ELT), Bennett’s Developmental Model of Intercultural Sensitivity (DMIS), and Bandura’s Social Cognitive Theory, this mixed-methods study investigates how AI-adaptive VR modules influence user progression along intercultural sensitivity stages. The simulations are designed to mimic high-stakes multicultural workplace scenarios—such as remote team conflict, hierarchical ambiguity, and cross-cultural negotiation—leveraging real-time emotional feedback and decision-based branching to personalize the learning journey. Participants from global tech companies underwent structured VR-based learning interventions (3+ ELT cycles) over a four-week period. Pre- and post-assessments using the Intercultural Development Inventory (IDI) and Employer-Validated Global Soft Skills Index (EGSSI) revealed statistically significant growth in CCC, especially in adaptation and empathy. Qualitative reflections and biometric engagement data also indicated higher immersion and emotional resonance compared to video- or lecture-based training. The study offers compelling evidence that AI-augmented, VR-enabled soft skills programs can bridge cultural divides more effectively than traditional methods. It positions immersive technology as not just a medium, but a transformative agent in global workforce development.
What Are The Underlying Service Dimensions That Determine Customer Satisfaction For Vistara Airlines? Are Customers Satisfied Or Dissatisfied?
Authors: Meegada Maneeth
Abstract: This study explores the underlying service dimensions that determine customer satisfaction for Vistara Airlines, with particular focus on the SERVQUAL model—comprising Reliability, Assurance, Tangibles, Empathy, and Responsiveness. Analysis reveals that tangibles (such as modern aircraft, in-flight services, and physical comfort) and empathy (personalized care and attention) significantly contribute to customer satisfaction. Historically, Vistara has enjoyed a strong reputation for premium service, earning high customer satisfaction and industry accolades. However, recent developments—especially following its merger with Air India—have led to increased customer dissatisfaction, citing issues with staff responsiveness, flight reliability, and diminished service quality. While Vistara once set benchmarks in India’s airline industry, the gap between customer expectations and actual service delivery has widened post-merger, indicating a decline in perceived service quality. This shift highlights the importance of maintaining service consistency and customer trust during organizational transitions.
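SERVQUAL analyses like this one rest on per-dimension gap scores: mean perception minus mean expectation, with negative gaps signalling service that falls short. A minimal sketch with made-up Likert responses (not the study's survey data):

```python
def servqual_gaps(expectations, perceptions):
    """Per-dimension SERVQUAL gap = mean perception - mean expectation.

    Both arguments map dimension name -> list of Likert responses.
    Negative gaps indicate the service falls short of expectations.
    """
    gaps = {}
    for dim in expectations:
        e = sum(expectations[dim]) / len(expectations[dim])
        p = sum(perceptions[dim]) / len(perceptions[dim])
        gaps[dim] = round(p - e, 2)
    return gaps

# Hypothetical 7-point Likert responses for two of the five dimensions:
expect = {"Tangibles": [6, 7, 6], "Responsiveness": [6, 6, 7]}
perceive = {"Tangibles": [6, 7, 7], "Responsiveness": [4, 5, 5]}
gaps = servqual_gaps(expect, perceive)
# A positive Tangibles gap and negative Responsiveness gap would mirror
# the pattern the study describes post-merger.
```

A full SERVQUAL instrument uses 22 paired items across the five dimensions; the gap arithmetic is the same.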
Unique Approach to Address Urban Air Pollution Using IoT Technology
Authors: Revelly Shiva Rani, Professor Dr. Ch. V. Phani Krishna
Abstract: Air pollution levels are rising dramatically in both developed and developing countries, demanding a more practical and economical solution. This system is designed to monitor air pollution and share the resulting information with the public, using IoT together with cloud services so that the data are available in real time and at speed. The system is deployed in localities with severe air pollution, where the level of each hazardous pollutant is monitored at regular intervals. The Air Quality Index (AQI) for the monitored pollutants is computed, and the information is delivered to the public through a mobile app that displays the level of each monitored pollutant along with the AQI for the particular area, in both numerical and graphical form, so that anyone can understand the air quality of that area. The system can later be extended by allowing users to register in the app to receive daily or weekly air-quality reports as notifications. Air pollution has become one of the major issues of our time owing to the growing number of vehicles, increased machine use, and urbanization, and this rise in pollution has harmful consequences for public health. This project describes the design and implementation of an air-pollution detection system. The novelty here is a hands-on realization of Internet of Things concepts in a setting where everyday well-being is at real risk. The system is implemented using an Arduino controller board.
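The AQI the app displays is a piecewise-linear interpolation over pollutant breakpoint bands. As a sketch, using illustrative breakpoints patterned on the US EPA PM2.5 scale (a deployment would use the locally mandated table, e.g. India's National AQI):

```python
# Illustrative PM2.5 breakpoints (conc_lo, conc_hi, aqi_lo, aqi_hi);
# a real deployment would load the locally mandated AQI table.
PM25_BREAKPOINTS = [
    (0.0, 12.0, 0, 50),
    (12.1, 35.4, 51, 100),
    (35.5, 55.4, 101, 150),
    (55.5, 150.4, 151, 200),
    (150.5, 250.4, 201, 300),
    (250.5, 500.4, 301, 500),
]

def aqi_pm25(conc_ug_m3):
    """Linear interpolation within the breakpoint band containing the
    concentration: AQI = (Ihi-Ilo)/(Chi-Clo) * (C-Clo) + Ilo."""
    for c_lo, c_hi, i_lo, i_hi in PM25_BREAKPOINTS:
        if c_lo <= conc_ug_m3 <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo)
                         * (conc_ug_m3 - c_lo) + i_lo)
    raise ValueError("concentration outside breakpoint table")
```

On the device, the Arduino streams raw sensor concentrations to the cloud, where this conversion runs per pollutant; the overall AQI is the maximum of the per-pollutant sub-indices.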
Advanced Encryption For Quantum-Safe Video Transmission
Authors: Gaddam UshaKiran, Dr. B. Srinivasa Rao (Professor)
Abstract: This project enables secure video processing, encryption, and watermark embedding, focusing on user authentication, video encryption, and decryption capabilities. Users can register, log in, and upload videos along with watermarks for processing. Using the cryptography library, each uploaded video is encrypted, and its encryption key is split using Shamir’s Secret Sharing, ensuring secure key distribution and storage. The encrypted frames are stored separately for later retrieval and decryption. Decryption occurs through reassembling key shares, allowing the original video to be reconstructed, with the watermark extracted from the first frame. The application further provides options to download the decrypted video, view split frames, and explore contact and performance information pages. Employing OpenCV for video processing and secure file handling techniques, this system ensures data confidentiality and integrity through a user-friendly interface and robust back-end encryption mechanisms. The application uses secure upload and storage mechanisms for sensitive data, like key shares and encrypted frames, storing them in predefined folders. Key shares are stored separately, further protecting the decryption process from unauthorized access.
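The key-splitting step described above can be illustrated with a from-scratch sketch of Shamir's Secret Sharing over a prime field; the paper itself uses the `cryptography` library, so this standalone version is for illustration only:

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime large enough for a 16-byte key

def split_secret(secret, n_shares, threshold):
    """Split an integer secret into n_shares points on a random polynomial
    of degree threshold-1; any `threshold` shares reconstruct the secret."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n_shares + 1)]

def recover_secret(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret
```

With a 5-share, 3-threshold split as in a typical deployment, any three shares reassemble the encryption key while two reveal nothing about it.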
Evaluating Diagnostic Accuracy In Jaw Pathologies On Orthopantomograms: A Comparative Study Between Oral Radiologists And AI-Driven ChatGPT Analysis
Authors: Dr. Yashika Kewalramani, Arjun Singh Parihar
Abstract: Artificial Intelligence (AI) and its incorporation into dental imaging, particularly in the interpretation of radiographs known as Orthopantomograms, has led to many promising advancements. However, its clinical utility and diagnostic consistency remain subjects of investigation when compared to the judgment of trained oral radiologists. This study evaluates the diagnostic precision and variability between experienced oral radiologists and a widely accessible AI model “ChatGPT”, in analyzing different confirmed Jaw Pathologies through Orthopantomograms. By using systematic assessment methods, the study aims to ensure a balanced and objective examination of the potential incorporation of AI in oral radiodiagnosis.
OPTIMIZING IOT SENSOR NETWORKS: TOPOLOGIES, DATA AGGREGATION, AND CLOUD INTEGRATION
Authors: Palwinder Kaur Sandhu
Abstract: The design and management of sensor networks, which enable smooth communication between a variety of devices, from home appliances to specialized monitoring equipment, are critical components of the Internet of Things (IoT) ecosystem. An effective sensor network’s design is greatly influenced by the topology chosen, such as mesh or star configurations, each of which is suitable for a specific application. As IoT adoption grows, the challenges of big data—volume, velocity, variety, and veracity—become more apparent. Since sensor data is inexpensive to generate but costly to transmit, store, and process, early-stage edge processing is essential for system efficiency. Modern, affordable, low-power aggregation devices reduce unnecessary data load by enabling local data processing, filtering, and transmission. Additionally, by providing remote configuration, real-time monitoring, and integrated data visualization, cloud-based sensor network management tools increase scalability and user-friendliness. Combining these technologies maximizes dependability, performance, and cost-effectiveness while satisfying the evolving requirements of Internet of Things applications.
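The edge-side filtering argued for above can be as simple as a deadband rule: a reading is transmitted only when it differs from the last transmitted value by more than a threshold. A minimal sketch with hypothetical readings and threshold:

```python
def deadband_filter(samples, threshold):
    """Keep only readings that move more than `threshold` away from
    the last transmitted value, cutting redundant transmissions."""
    sent = []
    last = None
    for s in samples:
        if last is None or abs(s - last) > threshold:
            sent.append(s)
            last = s
    return sent
```

On a slowly drifting signal this can drop most samples at the aggregation device while preserving every significant change for the cloud.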
Comparative Assessment Of Physico-Chemical Parameters Of Puliyampatti Pond Water And College Drinking Water
Authors: Ranjitha S., Tamilaracy K., Mary Jenifer G., Dhilipan M., D. Jeevanantham, B.E.
Abstract: Water quality plays a critical role in ensuring human health and well-being. This study compares the physico-chemical quality of water collected from Puliyampatti pond (near P.A. Educational Institution) and the treated drinking water supplied within the college campus. The parameters examined include pH, Total Dissolved Solids (TDS), chlorine content, and hardness. Results revealed that pond water exhibited higher TDS (489 ppm), hardness (3.6 mg/L), and lower chlorine (0.173 mg/L) compared to college drinking water, which showed a lower TDS (37 ppm), lower hardness (0.45 mg/L), but higher chlorine (1.3 mg/L). Both samples maintained pH within acceptable limits. The findings indicate that untreated pond water is unsuitable for direct consumption without treatment, while the treated college water meets desirable drinking water standards.
Crop Yield And Disease Prediction By Using Data Mining Framework
Authors: Abhilasha Pokharna, Dr. Dinesh Shrimali
Abstract: As we all know, India is the world’s second most populous country, and agriculture employs the vast majority of its people. Farmers continue to plant the same crops without trying new varieties, and they apply fertilizers in haphazard amounts without understanding the required composition and quantity. As a result, agricultural output suffers while the soil becomes acidic and degraded. We therefore created a solution to help farmers using machine learning techniques. Based on soil content and climatic conditions, our system will choose the optimum crop for a given piece of land. In addition, the system offers information on the necessary fertilizer content and quantity, as well as the seeds for growing it.
AI-Powered Agritech Chatbot: Revolutionizing Crop Management And Disease Detection For Farmers
Authors: Miss Chintapalli Lakshmi, Mr. Kunjam Nageshwar Rao, P. Mohan Rao
Abstract: India, as an agro-based economy, continues to have a substantial share of its population dependent on agriculture as the primary source of livelihood. However, productivity is often constrained by challenges such as limited access to timely information, difficulty in diagnosing plant diseases, and inadequate awareness of government schemes and market dynamics. Traditional reliance on manual methods or intermediaries frequently results in delays and misinformation, further hindering agricultural efficiency. To address these limitations, this paper presents an AI-powered Chatbot for Farmers, designed to deliver real-time, accurate, and accessible assistance. The system integrates Natural Language Processing (NLP) for query understanding, Convolutional Neural Networks (CNNs) with fine-tuned VGG-16 for plant disease detection, and machine learning models for crop recommendation and decision support. Furthermore, the chatbot incorporates multilingual support via translation APIs, enabling seamless interaction in regional languages and ensuring inclusivity across diverse farming communities. The proposed chatbot provides a wide range of services, including query resolution, crop suggestion, disease diagnosis from leaf images, and dissemination of critical updates on weather, market prices, and government policies. Experimental results demonstrate an accuracy of nearly 96% in disease classification and high precision in intent recognition, establishing the reliability and robustness of the system. By functioning as a virtual agricultural assistant, the solution empowers farmers with expert-level, user-friendly guidance, thereby enhancing decision-making, reducing losses, and ultimately improving agricultural productivity.
DOI: https://doi.org/10.5281/zenodo.17062303
Advancing Mental Health Diagnostics Via Social Media: A Comprehensive Review Of Machine Learning And Deep Learning Paradigms
Authors: Ms. Gude Kalyani
Abstract: Mental health challenges are rising worldwide, making early detection and monitoring increasingly important. With millions of people actively sharing thoughts and emotions on platforms like Facebook, Twitter, and Reddit, social media has become a valuable resource for understanding mental well-being. Earlier studies relied mainly on traditional machine learning (ML) techniques such as logistic regression, support vector machines, random forests, and ensemble models. These methods achieved only moderate results and often struggled with the complexity of natural language and diverse forms of data, limiting their effectiveness in real-world use. This work introduces a Mental Health Diagnostics framework that combines both social media data and personal details—such as age, family history, medical leave, and workplace challenges—to predict mental health conditions. The system applies a wide range of ML and deep learning (DL) approaches, with particular focus on a hybrid model that blends Bidirectional Long Short-Term Memory (BLSTM) with Convolutional Neural Networks (CNN). This design captures both sequential patterns and key contextual features, offering stronger predictive performance. Together with advanced models like RoBERTa and other ensemble methods, the proposed system achieves 99.6% accuracy. The findings demonstrate how integrating structured inputs with social media insights can create a reliable, scalable, and practical tool for mental health prediction, supporting early interventions and improved digital healthcare solutions.
Aeolus-DS: Dust-Aware AI Decision Support For Coccidioidomycosis (Valley Fever) A Design Science Research Framework Integrating Aerosol Remote Sensing, Land Disturbance, And Clinical Sentinel Signals
Authors: Harsha Sammang, Harshini Balaga, Aditya Jagatha
Abstract: Coccidioidomycosis (Valley fever), caused by Coccidioides spp., is a climate- and soil-mediated respiratory disease whose exposure arises from inhalation of spores entrained by wind from disturbed, desiccated soils. Incidence is rising across the U.S. Southwest and expanding arid zones. Traditional surveillance is retrospective and weakly coupled to dust-generating processes (drought, grading, off-road activity), limiting actionable lead time for clinicians, public health, and occupational safety. We present Aeolus-DS, a Design Science Research (DSR) artifact that fuses aerosol remote sensing (MAIAC AOD; dust fraction), mesoscale meteorology and soil moisture (ERA5), land-disturbance telemetry (construction and energy activity; off-highway vehicle events; nightlights), and clinical sentinel signals (syndromic ED chief complaints; pneumonia rule-out) into a dust-aware, AI-driven early warning and decision support system. Methodologically, we propose a graph spatiotemporal transformer with direction-aware attention and physics-guided regularization reflecting aeolian transport. Using county–week panels (2014–2024) for AZ–CA–NV, Aeolus-DS improves nowcasting MAE by 18% and two-week AUPRC by 21% over strong baselines (XGBoost, LSTM). Role-based “action cards” translate probabilistic forecasts and uncertainty into targeted mitigations (site watering cadence, temporary grading pauses, N95 staging, clinician test prompts). We evaluate predictive skill, calibration, runtime, interpretability, and stakeholder usability, and discuss governance, ethics, and portability to other dust-borne mycoses in climate-stressed regions.
Review Of The ArcSWAT Model: Advances, Applications, And Future Directions
Authors: Divyansh Singh Nikhil
Abstract: Hydrological modeling plays a pivotal role in addressing contemporary challenges of water resource management, particularly in regions facing rapid urbanization, climate variability, and increasing anthropogenic pressures. Among the widely adopted modeling frameworks, the Soil and Water Assessment Tool (SWAT) and its ArcGIS-integrated version, ArcSWAT, stand out as versatile, semi-distributed, process-based tools designed for simulating the impacts of land use, climate, and management practices on watershed hydrology. ArcSWAT has been extensively applied across continents, from small agricultural watersheds to large river basins such as the Mississippi, Nile, and Ganga, providing insights into surface runoff, evapotranspiration, groundwater flow, sediment transport, and water quality dynamics. Its integration with Geographic Information Systems (GIS) enables seamless spatial analysis, making it particularly suited to data-scarce basins in developing regions. This review synthesizes the development, structure, and functionality of the ArcSWAT model, with particular emphasis on its global and Indian applications. The analysis highlights key advances in calibration and validation approaches, including the use of SWAT-CUP and the SUFI-2 algorithm, as well as emerging practices of multi-site calibration to enhance model robustness. Special attention is given to case studies from India, including the Gomti River basin, where urbanization, agricultural intensification, and climate variability necessitate advanced modeling frameworks for sustainable management. The review identifies major advantages of ArcSWAT, such as its ability to handle large heterogeneous basins, perform scenario-based analyses, and integrate global datasets (e.g., SRTM DEM, Landsat LULC, FAO soils). 
However, limitations are also recognized, including data dependency, underrepresentation of water quality processes in many studies, and insufficient scenario-based applications in Indian contexts. Future research directions are outlined, focusing on coupling ArcSWAT with machine learning approaches, integrating climate change projections, enhancing parameter sensitivity and uncertainty analysis, and expanding hydrology–water quality modeling. By critically assessing past applications and current research gaps, this review establishes ArcSWAT as both a proven tool and an evolving framework for hydrological research. Its continued development and integration with emerging technologies hold the potential to transform watershed management and policy-making in the era of climate change and increasing water stress.
DOI: https://doi.org/10.5281/zenodo.17066923
Machine Learning-Based Prediction Of Water Chemistry And Water Quality Index In The Gomti River, Lucknow
Authors: Praveen Kumar Yadav
Abstract: Monitoring and maintaining water quality in urban rivers is crucial for ensuring both environmental sustainability and public health. The Gomti River, which flows through the densely populated city of Lucknow, faces severe stress due to rapid urbanization, untreated wastewater discharge, and growing anthropogenic pressures. This study focuses on predicting three key water quality parameters—pH, nitrate (NO₃), and biochemical oxygen demand (BOD)—which are widely recognized as critical indicators of river water health and are frequently used in Water Quality Index (WQI) assessment. To achieve this, the Extra Trees Regressor (ETR) model was applied to water quality datasets collected from the Central Water Commission monitoring station between 2016 and 2023. The dataset was pre-processed, normalized, and divided into training and testing subsets. Model performance was evaluated using statistical metrics such as R², MAE, MSE, and RMSE. The results demonstrated that ETR provided highly accurate predictions, achieving R² values above 0.95 for all three parameters while minimizing error values (MAE and RMSE). The predicted WQI values showed close alignment with actual observations, confirming the robustness and reliability of the model. These findings highlight the potential of machine learning-based approaches in forecasting river water quality and supporting timely, data-driven decision-making for pollution control and river management. This research contributes to environmental monitoring and geoscientific applications by demonstrating how ML methods can enhance water quality assessment, strengthen pollution mitigation strategies, and promote sustainable river basin management. Future work will focus on integrating satellite-based land use and meteorological data to improve spatial analysis and extending the modeling framework to other river systems, thereby improving generalizability and applicability.
DOI: https://doi.org/10.5281/zenodo.17067318
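The evaluation metrics cited above (R², MAE, RMSE) have simple closed forms; a plain-Python sketch of their definitions, independent of the paper's actual Extra Trees pipeline:

```python
import math

def r2(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean squared error; penalizes large misses more than MAE."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

An R² above 0.95, as reported, means the model explains over 95% of the variance in the held-out observations.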
Student Face Verification System For GCE Exam Authentication In Zambia
Authors: Chilufya Sydney
Abstract: This paper introduces an open-source, simplified facial recognition system intended to prevent impersonation during Zambia’s GCE exams. The system uses Python, OpenCV, and SQLite to perform two basic functions: enrolling student facial data at pre-exam registration and verifying identities in real time at the exam entrance. Designed from a resource-poor environment perspective, the solution is affordable (<$100 per exam site, equivalent to about K3,000 in Zambian currency), simple to deploy, and ethically grounded. The initiative fills significant gaps in the literature concerning the flexibility of facial recognition systems in sub-Saharan African testing regimes and the ethical adoption of biometric technology in schooling. Technical and operational performance is analyzed under normal lighting and equipment use through dataset-based controlled benchmarking, while usability estimates rely on stakeholder evaluations. Under ideal laboratory settings the system achieves 96.4% accuracy, and 87.8% accuracy at low luminance levels, taking 1.45 seconds per verification. The paper describes the entire system development lifecycle from inception to assessment and presents actual field results. The system offers a practical approach to enhancing exam security in educational institutions of developing regions, delivering operational tractability alongside technical sophistication and providing an applicable solution to combat examination impersonation without exceeding existing resource capabilities.
SCALABLE AND EFFICIENT APPROACHES TO GRACEFUL LABELLING IN GRAPH THEORY
Authors: Noor Jahan Fatima, Dr. Sarabjit Kaur
Abstract: Graceful labelling is a fundamental problem in graph theory with significant applications in communication networks, coding theory, VLSI circuit design, and combinatorial optimization. The graceful tree conjecture, proposed by Rosa in 1967, asserts that every tree can be assigned a graceful labelling, yet a general proof or counterexample remains elusive. Traditional constructive techniques and exhaustive searches have established results for specific graph families, but scalability challenges persist when dealing with larger instances. This paper investigates scalable and efficient approaches to graceful labelling by integrating constructive methods with heuristic and optimization-based strategies. We explore hybrid approaches that combine deterministic recursive labelling with stochastic metaheuristics such as genetic algorithms, simulated annealing, and tabu search. Additionally, we examine the role of integer linear programming (ILP), constraint satisfaction formulations, and parallel algorithms leveraging GPU acceleration and distributed computing frameworks. Experimental evaluations demonstrate that hybrid and parallel approaches outperform traditional heuristics in terms of scalability and efficiency, particularly for large trees and special graph families. Approximation-based relaxations are also shown to provide near-graceful solutions that guide heuristic refinement. Beyond computational advancements, this work highlights theoretical implications, including potential structural insights into the graceful tree conjecture and extensions to cyclic graphs. The proposed scalable frameworks not only advance computational verification but also contribute to bridging the gap between practical applications and theoretical challenges in graph labelling.
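For reference, a graceful labelling of a graph with m edges assigns distinct vertex labels from {0, …, m} so that the induced edge labels |f(u) − f(v)| are exactly {1, …, m}. A small checker of this definition, with an example labelling chosen for illustration:

```python
def is_graceful(edges, labels):
    """Return True if `labels` (vertex -> int) is a graceful labelling
    of the graph given as a list of (u, v) edges."""
    m = len(edges)
    # Vertex labels must lie in {0, ..., m}.
    if any(not (0 <= labels[v] <= m) for e in edges for v in e):
        return False
    # Induced edge labels must be exactly {1, ..., m}.
    edge_labels = {abs(labels[u] - labels[v]) for u, v in edges}
    return edge_labels == set(range(1, m + 1))
```

Verifiers like this are the inner loop of the heuristic and metaheuristic searches discussed above, so making them fast (e.g. incremental updates under label swaps) is central to scalability.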
Comparative Analysis Of Generic And Specialized Natural Language Processing Models Using Prompt Engineering
Authors: Sagar Gupta
Abstract: Recent advances in Natural Language Processing (NLP) have been driven by the widespread adoption of large-scale pretrained language models (LMs). While generic NLP models such as GPT, BERT, and T5 exhibit strong zero-shot and few-shot performance across diverse tasks, specialized NLP models (e.g., BioBERT, FinBERT, SciBERT) are fine-tuned on domain-specific corpora to achieve superior performance in targeted applications. With the emergence of prompt engineering as a method to guide large language models (LLMs), a new research challenge arises: can prompt engineering narrow the performance gap between generic and specialized models, or does domain-specific pretraining remain necessary? This paper provides a comparative analysis of generic and specialized NLP models under different prompt-engineering strategies, focusing on domains such as finance, healthcare, and legal text processing. Experimental findings indicate that while prompt engineering enhances the adaptability of generic LMs, specialized models continue to outperform in precision-critical tasks. The study underscores the complementary role of prompt design and domain-specific adaptation in the next generation of NLP systems.
Recurrent Neural Networks In Complex Finance Applications
Authors: Sagar Gupta
Abstract: The financial domain is inherently dynamic, stochastic, and complex, making it one of the most fertile grounds for the application of advanced machine learning techniques. Among these, Recurrent Neural Networks (RNNs) have emerged as particularly well-suited for modeling sequential and temporal dependencies in financial data. This paper explores the role of RNNs in complex finance applications, tracing their evolution from basic time-series forecasting to modern variants such as Long Short-Term Memory (LSTM) networks and Gated Recurrent Units (GRUs). The discussion highlights applications in algorithmic trading, credit risk assessment, fraud detection, portfolio optimization, and regulatory compliance. Case studies are presented to illustrate both the potential and the limitations of RNNs in finance. The paper concludes with a critical discussion of challenges such as interpretability, overfitting, adversarial risks, and future research directions, including hybrid neuro-symbolic architectures and transformer-RNN hybrids for financial intelligence.
Evolution Of A Neural Network In ERP Implementations
Authors: Sagar Gupta
Abstract: Enterprise Resource Planning (ERP) systems have long served as the backbone of organizational information systems, integrating finance, operations, human resources, supply chains, and customer-facing processes into unified platforms. Traditionally, ERP implementations relied on rule-based configurations and deterministic workflows. However, the evolution of neural networks has introduced adaptive, data-driven intelligence into ERP ecosystems. Neural architectures are increasingly being deployed to enhance demand forecasting, anomaly detection, process optimization, and user personalization within ERP systems. This paper traces the evolution of neural networks in ERP implementations, from early adoption in predictive analytics to contemporary applications in autonomous process automation and decision intelligence. It also explores case studies, challenges, and future research directions, highlighting the transformative potential of neural networks in reshaping the ERP landscape.
Crowdsource Activity As Applications Of Neural Networks
Authors: Sagar Gupta
Abstract: Crowdsourcing has emerged as a powerful mechanism for harnessing distributed human intelligence at scale, enabling diverse applications such as data annotation, collective problem solving, and decision-making across domains. With the advent of neural networks, crowdsourced activity has been both a source of critical training data and an arena for deploying advanced artificial intelligence systems to optimize participation, reliability, and outcome quality. This paper explores the intersection between crowdsourced activity and neural networks, emphasizing how neural architectures are applied to classify, validate, and enhance crowd contributions. The discussion spans natural language processing, computer vision, recommendation systems, quality assurance, and hybrid human–AI collaboration frameworks. The review concludes with challenges in scalability, bias mitigation, and ethical considerations, highlighting emerging opportunities for integrating neural networks to reshape crowdsourced ecosystems.
Optimized Neural Network For PV, Battery, Supercapacitor DC microgrid
Authors: Sharma Pankaj Kanhaiya, Professor Devendra Sharma, Professor Saurabh Gupta
Abstract: The integration of photovoltaic (PV) systems with battery and supercapacitor storage in DC microgrids demands efficient energy management to enhance system stability, reliability, and operational efficiency. This research presents an optimized neural network-based energy management approach tailored for a standalone DC microgrid incorporating PV panels, lithium-ion batteries, and supercapacitors. The neural network model is specifically designed to handle the nonlinear characteristics of the microgrid, optimize power flow, and maintain the state of charge (SoC) of energy storage devices within safe limits. By utilizing advanced training algorithms inspired by optimization techniques such as artificial rabbit optimization, the proposed system achieves improved prediction accuracy and load balancing. The approach also integrates a fuzzy logic control mechanism to facilitate real-time adaptive responses to dynamic load changes and renewable generation variability. Simulation results demonstrate enhanced voltage stability, reduced power fluctuations, and efficient energy distribution compared to conventional methods. This optimized neural network strategy effectively mitigates the challenges inherent in hybrid energy storage management, promoting longer battery life, quicker response times from supercapacitors, and overall system resilience. The study contributes significant insights toward the development of intelligent energy management systems for sustainable and autonomous DC microgrid applications.
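The battery/supercapacitor division of labour described above is commonly approximated by a frequency split: the battery follows the slow component of net demand while the supercapacitor absorbs the fast residual. A simplified low-pass sketch (a rule-based stand-in for illustration, not the paper's neural controller):

```python
def power_split(net_demand, alpha=0.2):
    """Exponential low-pass split of a demand series (kW).
    Battery tracks the smoothed demand; supercapacitor takes the
    fast residual. alpha is the (hypothetical) filter gain."""
    battery, supercap = [], []
    avg = 0.0
    for p in net_demand:
        avg = alpha * p + (1 - alpha) * avg  # slow component
        battery.append(avg)
        supercap.append(p - avg)             # fast transient
    return battery, supercap
```

By construction the two references always sum to the demand, and the supercapacitor's share decays to zero once the demand settles, which is what preserves battery life on step changes.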
Simulation And Comparative Analysis Of Unsymmetrical Faults On Grid Interconnection With ANN-Based Fault Classification
Authors: Er. Sandeep Tandon
Abstract: This paper presents a detailed simulation and analysis of unsymmetrical faults (LG, LL, LLL) in a three-phase grid interconnection using MATLAB/Simulink. The model includes two voltage sources representing grid ends, connected via a two-pi section transmission line to simulate realistic power transfer conditions. The system response to each fault type is analyzed in terms of voltage and current distortions. Separate fault simulations are carefully modeled using practical parameters. Output data is collected using Workspace blocks and statistically analyzed to extract minimum, maximum, and mean values. Comparative tables and waveform plots illustrate the behavior of each fault type. Additionally, the paper discusses the theoretical basis of fault currents using symmetrical components. The study aims to serve as a base model for educators, researchers, and developers of AI-based fault detection systems.
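The symmetrical-components basis mentioned above decomposes an unbalanced set of phase phasors into zero-, positive-, and negative-sequence components using the 120° rotation operator a = e^(j2π/3). A minimal sketch:

```python
import cmath

A = cmath.exp(2j * cmath.pi / 3)  # the 120-degree operator "a"

def sequence_components(va, vb, vc):
    """Fortescue transform: phase phasors -> (zero, positive, negative)."""
    v0 = (va + vb + vc) / 3
    v1 = (va + A * vb + A**2 * vc) / 3
    v2 = (va + A**2 * vb + A * vc) / 3
    return v0, v1, v2
```

For a balanced positive-sequence set the zero and negative components vanish; nonzero v0 or v2 during a simulated LG or LL fault is exactly the signature a fault classifier learns from.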
Study And Analysis Of Railway Bridge Piers Using Mathematical And Computational Computing System
Authors: Nikhil Gaur, Dr. Jyoti Yadav
Abstract: Most of the sub-structures of new railway river bridges in India are built with solid mass concrete gravity piers and abutments. These piers, designed without steel reinforcement, rely on the assumption that they are not subjected to tensile stresses under regular loading. However, during high-magnitude earthquakes, their safety becomes a critical concern, particularly in seismically active regions of India. This study assesses the seismic vulnerability of solid gravity bridge piers, which are key components of railway bridges, since they transfer loads between the substructure and the superstructure. Seven existing piers from the state of Gujarat were analyzed using free vibration analysis and nonlinear static (pushover) analysis in ABAQUS. Free vibration analysis revealed that the fundamental mode mass participation was always below 50%, while the cumulative participation of the first six modes remained under 80%, demonstrating significant contributions from higher vibration modes. Pushover analysis results confirmed the limited ductility of solid piers and highlighted their susceptibility under seismic excitations. The study emphasizes the need for seismic strengthening strategies to ensure the safety and serviceability of such piers.
DOI: https://doi.org/10.5281/zenodo.17081435
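The mass-participation figures quoted above come from effective modal masses: for a lumped-mass system with unit influence vector r, each mode φ contributes (φᵀMr)²/(φᵀMφ), and these contributions sum to the total mass over all modes. A small sketch under that lumped-mass assumption:

```python
def effective_modal_mass(phi, masses):
    """Effective modal mass (phi^T M r)^2 / (phi^T M phi) for a
    diagonal (lumped) mass matrix and influence vector r = ones."""
    L = sum(m * p for m, p in zip(masses, phi))       # phi^T M r
    Mn = sum(m * p * p for m, p in zip(masses, phi))  # phi^T M phi
    return L * L / Mn
```

A fundamental-mode participation below 50%, as found for the Gujarat piers, means the remaining modes carry most of the seismic mass, which is why higher-mode effects cannot be neglected.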
Study And Analysis On The Lateral Bearing Capacity Of Cantilever Rigid Piles Of Bridges
Authors: Nikhil Gaur, Dr. Jyoti Yadav
Abstract: To investigate the lateral ultimate bearing capacity of cantilever rigid piles subjected to large horizontal displacement, this paper analyzes the distribution characteristics of soil resistance along the pile side and explores calculation methods for lateral bearing capacity of pile foundations using both numerical simulation and theoretical approaches. The results indicate that, under large displacement conditions, the soil in front of the pile yields progressively from top to bottom. Once the soil adjacent to the pile reaches its limit displacement, the lateral soil resistance no longer increases with further displacement. The ultimate lateral bearing capacity of cantilever rigid piles under large horizontal displacement is determined based on the ultimate displacement of the side soil. Among the tested approaches, the modified “m” method demonstrates the best fitting accuracy. However, further investigation is required to define the applicable range of foundation coefficient distribution.
DOI: https://doi.org/10.5281/zenodo.17081470
Quantifying The Spatiotemporal Dynamics Of The Surface Urban Heat Island In Lucknow, India
Authors: Praveen Kumar Yadav, Kundan Bhushan, Er. Manoj Kumar Yadav
Abstract: Rapid urbanization is a primary driver of local climate change, leading to the formation of the Surface Urban Heat Island (SUHI) effect, which poses significant environmental and public health challenges. This study presents a comprehensive spatiotemporal analysis of the SUHI phenomenon in Lucknow, India, over a decade (2014–2024) by leveraging the analytical power of the Google Earth Engine (GEE) platform and ArcGIS. Using annual mean Land Surface Temperature (LST) derived from Landsat 8 thermal imagery, we employed two distinct metrics to quantify the SUHI effect: statistical Urban Hot Spot (UHS) analysis and the Urban Thermal Field Variance Index (UTFVI). SUHI hotspots were identified as areas with LST exceeding two standard deviations above the regional mean (LST > μ + 2σ), while the UTFVI was used to classify the urban environment into six levels of thermal comfort. The results reveal a significant intensification and spatial expansion of the SUHI effect over the study period. The total area identified as an urban hotspot increased from 25 km² in 2014 to 26 km² in 2024, a growth of over 4%. Concurrently, the area experiencing the worst ecological conditions (“Worst” UTFVI zone) expanded from 1,038 km² to 1,050 km², a growth of 1.16%. These high-temperature zones are predominantly concentrated in the city’s central commercial core and newly developed residential areas, correlating with the expansion of impervious surfaces. This research provides quantitative evidence of Lucknow’s escalating thermal risk and underscores the utility of GEE and geospatial indices for monitoring urban environmental health. The findings offer critical insights for policymakers and urban planners to develop targeted heat mitigation strategies, such as the strategic implementation of green infrastructure.
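The hotspot criterion LST > μ + 2σ can be sketched directly; the LST values below are synthetic, for illustration only:

```python
def classify_hotspots(lst_values):
    """Flag pixels whose LST exceeds mu + 2*sigma (the UHS criterion)."""
    n = len(lst_values)
    mu = sum(lst_values) / n
    sigma = (sum((x - mu) ** 2 for x in lst_values) / n) ** 0.5
    threshold = mu + 2 * sigma
    return [x > threshold for x in lst_values]
```

In the GEE workflow the same test runs per pixel over the annual mean LST image; the flagged area, summed over pixels, gives the hotspot extent reported above.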
Forensic Analysis Of NTFS: Structure, Vulnerabilities, And Novel Recovery Techniques
Authors: Anish Kumar, Sourav Ray, Ambrose Henrey Mwikwabe, Shreya Gandh, Rohit Kumar Singh
Abstract: The New Technology File System (NTFS) is the default file system for modern Windows and contains rich metadata (journaling, security descriptors, etc.) that aids forensic investigations. Its Master File Table (MFT) holds records for every file (even deleted ones), while transactional logs ($LogFile and $UsnJrnl) record detailed changes. However, NTFS also offers covert storage (alternate data streams, directory $DATA, and boot record slack) and exhibits known integrity flaws. This paper reviews current NTFS forensic methods – including MFT parsing, journal analysis, and hidden-data detection – and identifies weaknesses (e.g. limited $MFTMirror backup, unexamined boot sector areas). We propose novel recovery techniques: an enhanced boot-sector reconstruction algorithm (combining backup boot data with $LogFile-derived geometry) and an improved metadata restoration process that leverages $LogFile and signature scanning when the MFT is damaged. We demonstrate these on synthetic NTFS images and show improved recovery of system structures and hidden content compared to baseline tools. The contributions include new forensic workflows and illustrative diagrams of NTFS layout and analysis steps.
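The MFT parsing the review covers starts at the FILE record header. A sketch of the fixed 32-byte portion, using the field layout as commonly documented in public NTFS references (offsets are those references' values, not verified against the paper's tooling):

```python
import struct

def parse_mft_header(record):
    """Parse the fixed part of an NTFS FILE record header (32 bytes,
    little-endian): signature, update-sequence info, $LogFile LSN,
    sequence number, link count, first-attribute offset, flags, sizes."""
    sig, usa_ofs, usa_cnt, lsn, seq, links, attr_ofs, flags, used, alloc = \
        struct.unpack_from("<4sHHQHHHHII", record, 0)
    if sig != b"FILE":
        raise ValueError("not a FILE record")
    return {
        "lsn": lsn,
        "sequence": seq,
        "hard_links": links,
        "first_attr_offset": attr_ofs,
        "in_use": bool(flags & 0x0001),       # cleared for deleted files
        "is_directory": bool(flags & 0x0002),
        "bytes_in_use": used,
        "bytes_allocated": alloc,
    }
```

Records with the in-use flag cleared are exactly the deleted files that remain recoverable, which is why carving the MFT is a first step in most NTFS workflows.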
Geostatistical And Machine Learning Framework For PM₂.₅ Prediction In Urban Uttar Pradesh, India
Authors: Manoj Kumar Yadav, Deepak Kumar Singh
Abstract: Air pollution has emerged as one of the most serious environmental and public health challenges in South Asia, with fine particulate matter (PM2.5) identified as the most pernicious pollutant due to its ability to penetrate deep into the human respiratory system. Uttar Pradesh, the most populous state in India, frequently records PM2.5 concentrations that exceed national and international standards. This study presents an integrated framework that combines geostatistical interpolation and machine learning regression to predict PM2.5 levels across ten non-attainment cities in Uttar Pradesh. Daily PM2.5 data for the period 2021–2024 were obtained from continuous monitoring stations and subjected to rigorous preprocessing. Spatial interpolation using Ordinary Kriging was implemented to generate high-resolution exposure surfaces, while machine learning algorithms including Random Forest, Gradient Boosting Regressor, Extreme Gradient Boosting, Support Vector Regression, and K-Nearest Neighbour were trained to capture temporal and spatial variability. Results demonstrate that PM2.5 concentrations consistently exceeded permissible limits, with pronounced seasonal peaks in winter and relative minima during monsoon months. Kriging revealed spatial clustering of pollution hotspots in Ghaziabad, Kanpur, and Lucknow, while peripheral cities exhibited lower but still concerning levels. Among machine learning models, XGBoost achieved the highest predictive performance with R² values above 0.74, followed by Gradient Boosting. Integration of Kriging-derived features into machine learning workflows improved prediction accuracy by 8–12%. The study demonstrates that hybrid geostatistical–machine learning approaches provide reliable and high-resolution PM2.5 predictions, enabling early-warning systems, spatially targeted interventions, and evidence-based policy planning.
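The study's spatial step uses Ordinary Kriging to build exposure surfaces from station data. As a hedged illustration of that idea only, the sketch below substitutes inverse-distance weighting (IDW), a much simpler interpolator, with made-up station coordinates and PM2.5 values:

```python
def idw_estimate(stations, target, power=2):
    """Inverse-distance-weighted estimate at `target` from ((x, y), value) pairs."""
    num = den = 0.0
    for (x, y), value in stations:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0:
            return value  # target coincides with a monitoring station
        weight = d2 ** (-power / 2)
        num += weight * value
        den += weight
    return num / den

# Hypothetical stations on a km grid, PM2.5 in ug/m3 (not data from the study).
stations = [((0, 0), 80.0), ((10, 0), 120.0), ((0, 10), 100.0)]
print(round(idw_estimate(stations, (5, 5)), 1))  # equidistant stations -> plain mean
```

Unlike Kriging, IDW does not model spatial autocorrelation via a variogram, so it carries no uncertainty estimate; it is shown here only to make the interpolation step concrete.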
Field Visit Report On The Water Treatment Plant And Combined Water Supply Scheme At Pollachi
Authors: Sangeeth Kumar J, Janarthanan V, Logeshwaran S, D. Jeevanantham, B.E.
Abstract: Water treatment plants (WTPs) play a fundamental role in delivering safe and reliable drinking water to urban and rural populations. This journal paper documents a field visit to the Pollachi Water Treatment Plant (WTP) located at Kolathur Village, Pollachi Taluk, Coimbatore District, which is part of the Combined Water Supply Scheme (CWSS) supplying Pollachi North, Pollachi South, Kinathukadavu, Gudimangalam, and adjoining habitations. The scheme sources water from the Aliyar River, with an intake well and raw water pump house that lifts water for treatment. The plant consists of headworks, aerator, stilling chamber, flash mixers, dividing chambers, clariflocculators, filter beds, clear water sump, and chemical treatment units for coagulation, flocculation, and disinfection. During the visit, the operation of raw water pumping mains, filter media layers, chlorination arrangements, laboratory facilities, booster pumping stations, and service reservoirs was observed. With a designed treatment capacity of 26.38 MLD, the scheme ensures reliable water supply to urban wards and more than 200 rural habitations. This field exposure enabled students to understand the engineering design and operational aspects of drinking water treatment and distribution, bridging theoretical knowledge with field practice.
Field Visit Report On The Wastewater Treatment Plant At Pollachi
Authors: Mohamed Asiq A, Santhosh M, Vishal S, D. Jeevanantham, B.E.
Abstract: Wastewater treatment is essential for safeguarding public health, protecting ecosystems, and supporting sustainable urban development. This report presents insights from an academic field visit to the Government Wastewater Treatment Plant (WWTP) at Pollachi, Tamil Nadu. The plant is based on Sequential Batch Reactor (SBR) technology, which provides an efficient and compact solution for secondary treatment of municipal sewage. During the visit, students observed the general layout of the facility, including preliminary units (receiving chamber, screens, grit chambers), secondary biological treatment (SBR reactors, decanters), tertiary treatment (chlorination chambers, contact tanks), and sludge handling units (sludge well, centrifuge building). The plant also houses supporting infrastructure such as laboratory facilities, blower rooms, and landscaped green belts that enhance both aesthetics and environmental protection. The visit provided practical exposure to treatment operations, sludge management, effluent quality monitoring, and safety protocols. It also highlighted the broader significance of WWTPs in ensuring sustainable sanitation, preventing water pollution, and promoting wastewater reuse. This report connects classroom knowledge of environmental engineering with real-world field practice, emphasizing the critical role of wastewater treatment plants in urban infrastructure.
Care Smart AI Hospital Management System
Authors: Saiyed Aiyatullah Kalimullah, Malek Mohammadarsh Mohammedasif, Jethava Shyam Hiteshbhai, Chudasma Dhruv Dineshbhai
Abstract: This project presents Care Smart AI, a comprehensive Hospital Management System (HMS) integrated with artificial intelligence to improve healthcare delivery and operational efficiency. The system leverages modern full-stack technologies including Flask for backend API services, MongoDB for data persistence, React and TailwindCSS for responsive user interfaces, and machine learning for symptom assessment and diagnostic report summarization. Care Smart AI enables secure, efficient patient management with role-based access for patients, doctors, and administrators. It demonstrates a scalable, accessible, and intelligent platform that enhances clinical decision-making, automates administrative tasks, and improves patient care quality across healthcare institutions.
PROPERTY HUB
Authors: Amir Shabbir Patel, Sahil AA Khan, Dhairya Suryawanshi, Rohan Karchuli
Abstract: The real estate industry is currently experiencing a rapid digital transformation, largely fueled by the integration of artificial intelligence (AI) technologies. Among the most promising applications is the use of AI-powered recommendation systems, which aim to redefine how buyers, sellers, and agents interact with property platforms. These intelligent systems are designed to analyze large volumes of property data and user preferences, offering highly personalized recommendations that improve the overall user experience. By leveraging data-driven insights, AI has the potential to simplify property discovery, reduce the complexity of decision-making, and enhance overall market efficiency. This study explores the implementation of different AI models, including machine learning algorithms, deep learning techniques, and natural language processing (NLP), within the context of real estate platforms. We evaluate their ability to process structured and unstructured data such as location, price, amenities, and even user reviews or natural language queries. A prototype recommendation system was developed and tested using real user behavioral data, including browsing history, clicks, and saved properties. The case-based experiment demonstrated that AI-enabled recommendations not only improved engagement but also significantly reduced search time, making the property-hunting process more efficient and user-centric. In addition to the technical benefits, this paper also examines the broader challenges and ethical considerations associated with AI adoption in real estate. Issues such as data privacy, algorithmic bias, and transparency in recommendations are highlighted as key areas that require careful attention. Furthermore, the study identifies opportunities for future research, such as integrating predictive analytics for market trends, enhancing trust through explainable AI, and expanding personalization by considering emotional and lifestyle factors. By addressing these challenges and advancing the current models, AI-driven recommendation systems can play a transformative role in shaping the future of the real estate industry.
Construction Of Environmental Quality Index Of Lucknow City For Assessment Of Public Health (2020–2024)
Authors: Priya Jaiswal
Abstract: This study presents the development and evaluation of an Environmental Quality Index (EQI) for Lucknow city, aimed at assessing the environmental factors that influence public health outcomes. The EQI is designed to integrate three critical environmental components—air quality, water quality, and green cover—which are known to have direct and indirect effects on human health. Data spanning from 2020 to 2024 were collected from reputable government sources, including the Central Pollution Control Board (CPCB), Central Ground Water Board (CGWB), and the Forest Survey of India (FSI). These datasets were systematically processed, analyzed, and normalized to create a composite index that represents the overall environmental condition of the city in relation to public health risks. The results indicate that Lucknow’s environmental quality generally falls within the moderate to poor range, reflecting significant challenges for maintaining population health. Rising levels of air pollutants, persistent water contamination, and limited improvement in urban green spaces collectively contribute to increased vulnerability to respiratory, cardiovascular, and waterborne diseases. Year-wise analysis reveals gradual deterioration in air and water quality, highlighting the urgent need for targeted public health interventions and environmental management strategies. The EQI developed in this study provides a valuable tool for policymakers, health authorities, and urban planners to identify high-risk areas, prioritize interventions, and monitor the effectiveness of measures aimed at reducing environmental health hazards.
DOI: https://doi.org/10.5281/zenodo.17098074
The Emergence of “New Markets” Under The Changed Global Scenario
Authors: Ms. S. Sushma Rawath
Abstract: The rapidly evolving global scenario, characterized by technological advancements, geopolitical shifts, and socio-economic transformations, has led to the emergence of “new markets.” These markets are driven by a combination of digital innovations, environmental imperatives, demographic changes, and evolving consumer preferences. Opportunities in areas like renewable energy, digital finance, sustainable agriculture, and advanced healthcare define new markets that are no longer constrained by traditional industrial or geographic boundaries. Furthermore, the globalization of technology and digital platforms has enabled businesses to access previously untapped regions and demographics, particularly in developing economies. This paper explores the drivers behind these emerging markets, their implications for global trade, and strategies businesses can adopt to thrive in this dynamic environment. It also highlights the challenges associated with navigating regulatory complexities, cultural differences, and technological disparities. Understanding and adapting to these new markets is crucial for fostering inclusive and sustainable economic growth in the 21st century.
AI-Powered Assistive Vision: A Novel Deep Learning Framework For Object Detection And Recognition For The Visually Impaired
Authors: Miss. Mounika Lokavarapu, Dr.G. Sharmila Sujatha
Abstract: Object recognition plays a crucial role in computer vision applications, particularly in assisting visually impaired individuals for safe and independent navigation. Despite its significance, existing techniques often face limitations in recognizing multiple objects efficiently and accurately. The aim of this work is to develop a robust multi-label object recognition framework capable of detecting and classifying surrounding objects in real time to enhance situational awareness for visually impaired users. The proposed system takes real-world images as input and processes them using machine learning and advanced computer vision algorithms. A multi-label classification approach is employed to simultaneously detect and group objects, reducing detection time while improving recognition accuracy. By leveraging deep learning models with optimized type/grouping techniques, the system achieves faster execution with best-in-class time complexity. Experimental analysis demonstrates that the framework not only improves detection performance but also provides reliable object recognition in both indoor and outdoor environments, making it highly effective for real-world navigation assistance. The proposed framework, “AI-Powered Assistive Vision: A Novel Deep Learning Framework for Object Detection and Recognition for the Visually Impaired,” is developed using Python with TensorFlow/Keras and OpenCV libraries, and implemented under embedded hardware with camera and processing units, enabling real-time deployment for assistive navigation applications.
A Comprehensive Evaluation Of The Water Quality Of The Saryu River In Ayodhya Based On Physico-Chemical Parameters
Authors: Vishal Yadav, Manas Mishra, Aditya verma
Abstract: In Ayodhya, Uttar Pradesh, India, the Saryu River is revered as a sacred river. However, the quality of the water is declining as a result of anthropogenic activities. The goal of the current study was to use established techniques to evaluate the quality of river water by analyzing bacterial populations and physicochemical characteristics with seasonal fluctuations. The majority of the physicochemical parameters, primarily pH, DO, BOD, and TDS, were found to be within the allowable levels that regulatory bodies had suggested. Other parameters, such as alkalinity, fluoride, and chemical oxygen demand (COD), were marginally above the allowable limits. Microbial investigations revealed the existence of both fungal and bacterial communities. The rainy season has the highest bacterial concentration, followed by the summer and winter seasons. The results of this study may help with irrigation and drinking water quality monitoring throughout the year.
DOI: https://doi.org/10.5281/zenodo.17098581
Enhancing Cyber Defence Through Supervised Machine Learning Experimental Evaluation On The NSL-KDD Dataset
Authors: Mrs. Kocherla Jayanthi
Abstract: The rapid evolution of cyber threats demands effective and adaptive intrusion detection systems to protect critical network infrastructures. This study seeks to evaluate the efficacy of supervised machine learning models in detecting network intrusions using the NSL-KDD dataset. The NSL-KDD dataset, a well-established benchmark for intrusion detection, undergoes thorough pre-processing, including handling missing values, feature normalization, and categorical encoding to ensure high-quality input data. We implement a range of supervised machine learning algorithms, namely Decision Tree, Random Forest, Naïve Bayes, K-Nearest Neighbours (KNN), Gradient Boosted Trees, and Support Vector Machine (SVM), to classify network traffic as either benign or malicious. The process involves splitting the dataset into training and testing subsets, followed by hyperparameter optimization through grid search to enhance model performance. We evaluate the models using key metrics such as accuracy, confusion matrix, Receiver Operating Characteristic (ROC) curve, and Area Under the Curve (AUC). Our findings reveal that Random Forest and Gradient Boosted Trees achieve superior accuracy and lower false positive rates compared to other classifiers. The comparative analysis provides practical insights into each algorithm’s strengths and limitations for cybersecurity applications.
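The evaluation metrics this abstract names (accuracy, precision, recall, F1) all follow directly from a binary confusion matrix. A minimal illustration with invented counts, not results from the paper:

```python
def classification_metrics(tp, fp, fn, tn):
    """Accuracy, precision, recall and F1 from binary confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)           # of flagged traffic, how much was malicious
    recall = tp / (tp + fn)              # of malicious traffic, how much was caught
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Made-up counts: 90 attacks caught, 10 false alarms, 20 missed, 880 benign.
acc, prec, rec, f1 = classification_metrics(tp=90, fp=10, fn=20, tn=880)
print(f"acc={acc:.2f} precision={prec:.2f} recall={rec:.2f} f1={f1:.2f}")
```

The example also shows why accuracy alone misleads on imbalanced traffic: accuracy is 0.97 even though nearly a fifth of attacks are missed.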
Removal Of Heavy Metals From Tannery Effluent Using Agro-Waste Low-Cost Adsorbents
Authors: Shivendra Singh, Manoj Yadav
Abstract: Tannery effluents are a major source of chromium contamination, particularly hexavalent chromium [Cr(VI)], which is highly toxic, carcinogenic, and persistent in aquatic systems. Conventional treatment methods for chromium removal are often costly and environmentally unsustainable. This study investigates the potential of low-cost, eco-friendly agro-waste materials—sawdust, clay, and used tea leaves—as adsorbents for the removal of both total chromium and hexavalent chromium from tannery effluent. Batch adsorption experiments were carried out to evaluate the effect of parameters such as pH, contact time, adsorbent dosage, and initial chromium concentration. The results demonstrate that all three materials exhibit significant adsorption capacities, with efficiency varying across the different adsorbents. Sawdust and used tea leaves showed higher affinity towards Cr(VI), while clay exhibited better overall performance in reducing total chromium levels. The adsorption process was found to follow pseudo-second-order kinetics and fit well with the Langmuir isotherm model, suggesting monolayer adsorption on homogeneous surfaces. This study highlights the feasibility of employing locally available agro-waste adsorbents as sustainable alternatives to conventional methods for the treatment of tannery wastewater, thereby contributing to cost-effective and environmentally friendly wastewater management.
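For readers unfamiliar with the Langmuir isotherm the abstract mentions, the equilibrium uptake is q_e = q_max·K·C / (1 + K·C). A small sketch with hypothetical parameters, not values fitted in this study:

```python
def langmuir_q(c, q_max, k):
    """Equilibrium adsorption capacity q_e = q_max*K*C / (1 + K*C).

    c: equilibrium concentration (e.g. mg/L), q_max: monolayer capacity
    (mg/g), k: Langmuir constant (L/mg). Parameters here are illustrative.
    """
    return q_max * k * c / (1 + k * c)

# Hypothetical fit: q_max = 10 mg/g, K = 0.5 L/mg.
print(langmuir_q(2.0, 10.0, 0.5))  # at C = 1/K the surface is half saturated -> 5.0
print(langmuir_q(1e6, 10.0, 0.5))  # at high C, q_e plateaus near q_max
```

The plateau at q_max is what makes the Langmuir fit consistent with the monolayer-adsorption interpretation given in the abstract.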
Assessment Of Ambient Air Quality Of Prayagraj City During Post-Monsoon 2024
Authors: Vipin Kumar, Manvika Chaudhary, Manoj Kumar Yadav
Abstract: Air pollution is a pressing concern in Indian cities, particularly due to increasing vehicular load, industrial expansion, and urban activities. This study assessed the ambient air quality of Prayagraj city during the post-monsoon season of 2024 across residential, commercial, and industrial zones. The monitoring focused on PM₁₀, PM₂.₅, SO₂, NO₂, and trace metals (Pb and Ni), and the results were compared with the National Ambient Air Quality Standards (NAAQS). The analysis revealed that PM₁₀ levels in residential areas ranged between 112–125 µg/m³, with Rambagh and Georgetown recording the highest values, nearly double the NAAQS limit of 60 µg/m³. Commercial areas exhibited even higher concentrations, with CMT and Johnstonganj exceeding 150 µg/m³, while industrial zones such as the Naini Industrial Area peaked above 160 µg/m³, indicating severe exceedances. Similarly, PM₂.₅ concentrations ranged from 38–52 µg/m³ in residential areas, while commercial locations consistently surpassed 80 µg/m³, far above the NAAQS standard of 40 µg/m³. Industrial sites again recorded the highest PM₂.₅ levels, approaching 100 µg/m³. In contrast, gaseous pollutants showed moderate levels. SO₂ concentrations remained between 20–35 µg/m³ in residential and commercial zones and around 40 µg/m³ in industrial areas, all within the permissible limit of 80 µg/m³. NO₂ levels averaged 25–40 µg/m³ in residential areas, with commercial hotspots reaching 45 µg/m³, and industrial sites recording around 50 µg/m³, still below the standard of 80 µg/m³ but indicative of vehicular and industrial influence. Overall, the study demonstrates that particulate matter (PM₁₀ and PM₂.₅) is the most critical pollutant in Prayagraj, with concentrations 2–3 times higher than the standards, while SO₂ and NO₂ remained within safe limits. These findings highlight the urgent need for targeted emission control strategies focusing on traffic management, industrial regulation, and dust mitigation.
Temporal Trend Evolution Mapping In Scientific Literature
Authors: Poonam Mishra, Neeraj Gupta
Abstract: The rapid acceleration of scientific publication rates has created unprecedented challenges in tracking the evolution of research trends and identifying emerging paradigms within academic disciplines. This paper presents a novel computational framework for temporal trend evolution mapping in scientific literature that combines advanced natural language processing techniques with dynamic network analysis to capture and visualize the progression of scientific concepts over time. Our methodology integrates transformer-based document embeddings, temporal clustering algorithms, and graph-based trend propagation models to create comprehensive maps of knowledge evolution across multiple time scales. The framework employs a multi-dimensional approach that analyzes citation patterns, semantic similarity evolution, and author collaboration networks to identify trend emergence, maturation, and decline phases. Experimental validation on large-scale datasets from PubMed, arXiv, and Web of Science demonstrates the framework’s effectiveness in detecting significant research trends up to 18 months before they become mainstream, with precision scores exceeding 0.89 for trend prediction tasks. The system successfully identified the emergence of CRISPR gene editing research, COVID-19 therapeutic developments, and artificial intelligence applications in drug discovery as major trending topics months before their widespread recognition. Our contribution provides researchers, funding agencies, and academic institutions with powerful tools for strategic research planning, early trend identification, and comprehensive understanding of scientific knowledge evolution patterns.
Experimental Study On The Properties Of Concrete Using Marble Powder And Steel Fibres As Partial Cement Replacement
Authors: Deepak Kumar Mishra
Abstract: Concrete, a fundamental material in construction, is increasingly being modified to incorporate sustainable alternatives that enhance performance while minimizing environmental impact. This study investigates the effects of partially replacing cement with marble powder and adding steel fibres in varying proportions (0%, 0.5%, 1%, 1.5%, and 2.0%) on the mechanical properties of M25 grade concrete. Results show that a mix containing 15% marble powder and 1% steel fibre achieves optimal compressive, split tensile, and flexural strength at 28 days. The marble powder improves workability due to its smooth texture and spherical shape, while the addition of steel fibre, though reducing workability, enhances bonding and overall strength. The findings suggest that the combination of marble powder and steel fibres can be effectively used in structural applications such as multistoried buildings and bridges. A recommended optimal mix of 15% marble powder and 1% steel fibre offers the best performance, though further long-term studies are advised to assess durability and field performance.
Air Quality And Public Health In Lucknow: Long-Term PM2.5 Exposure, Seasonal Variability, And Policy Implications
Authors: Himanshu Ranjan, Manoj Kumar Yadav, Sushant Kumar
Abstract: Lucknow, a rapidly urbanising metropolis in the Indo-Gangetic Plain, is seeing dangerously high levels of fine particulate matter (PM2.5), which endangers both the environment and human health. In contrast to Delhi, which has seen a great deal of study on air quality, Lucknow has not received as much attention despite its increasing industrial emissions, biomass burning, and vehicle traffic. This study looks at the temporal and geographical trends of PM2.5 in Lucknow, identifies the main sources of emissions, and uses exposure-response relationships to assess the associated health risks. Data from state monitoring stations and the CPCB, together with meteorological factors, were used to evaluate the daily and seasonal variations in PM2.5 concentrations. According to the findings, stable atmospheric conditions and biomass burning cause PM2.5 levels to peak throughout the winter months, with concentrations frequently above both national and WHO guidelines. Significant attributable hazards for cardiovascular and respiratory morbidity are suggested by epidemiological studies, especially for older and paediatric groups. The critical need for integrated mitigation strategies such as switching to cleaner fuels, reducing vehicle emissions, and increasing green cover is highlighted in this article. The results give policy interventions under India’s National Clean Air Programme (NCAP) an evidence-based basis, which is important given Lucknow’s geographic location and population susceptibility.
Hybrid CNN–LSTM Deep Learning Model For Forecasting PM2.5 And PM10 Concentrations In Lucknow, India
Authors: Aditya Verma, Himanshu Ranjan, Manoj Kumar Yadav, Sushant Kumar
Abstract: The Indo-Gangetic Plain of India continues to face a serious environmental and public health problem due to air pollution, as particulate matter (PM2.5 and PM10) continuously surpasses permissible limits. In order to predict particulate matter concentrations in Lucknow, India, this study creates a hybrid Convolutional Neural Network–Long Short-Term Memory (CNN–LSTM) model utilising an eight-year dataset (2017–2024) of meteorological and air quality indicators. To guarantee dependability, the data underwent preprocessing using min–max normalisation, wind vector reconstruction, interpolation, and outlier correction. Through the integration of CNN’s feature extraction and LSTM’s sequential learning, the CNN–LSTM model is able to capture temporal relationships as well as spatial correlations. R2, RMSE, MAE, and MAPE were used to compare performance to standalone models. According to the results, the hybrid method successfully reproduced seasonal variability, including winter peaks and monsoon-driven falls, with the maximum accuracy (R2 = 0.658 for PM2.5; R2 = 0.754 for PM10). The CNN–LSTM outperformed other models in terms of robustness and generalisability, although somewhat underestimating intense episodic surges. Under India’s National Clean Air Programme (NCAP), the results highlight the model’s potential as a decision-support tool for early warning systems and policy actions. The importance of deep learning hybrids for long-term air quality control in heavily polluted metropolitan areas is demonstrated by this work.
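Of the preprocessing steps this abstract lists, min–max normalisation is the simplest to make concrete. An illustrative sketch with an invented feature column, not the study's data pipeline:

```python
def min_max_normalize(series):
    """Rescale a feature column to [0, 1]; assumes the column is not constant."""
    lo, hi = min(series), max(series)
    return [(v - lo) / (hi - lo) for v in series]

# Toy PM2.5 column (ug/m3), values invented for illustration.
print(min_max_normalize([40.0, 80.0, 120.0]))  # -> [0.0, 0.5, 1.0]
```

Scaling every input feature to a common range like this keeps gradient-based training of CNN and LSTM layers numerically stable; the same (lo, hi) pair must be reused to invert the scaling on model predictions.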
DOI: https://doi.org/10.5281/zenodo.17103728
White Wine Pricing: A Mathematical Model for Determining Optimal Retail Value Based on Chemical Properties
Authors: Safaan Shawl
Abstract: In an era increasingly dominated by algorithmic precision and data-driven decision-making, the question of whether an artisanal product such as white wine can be priced through a deterministic model seems both audacious and tantalising. This paper embarks on precisely that odyssey—an independent attempt to formulate an original pricing algorithm for white wines by reverse-engineering the latent relationships between their physicochemical properties and their market value. Drawing from publicly available datasets and deploying statistical intuition rather than merely machine learning brute force, this research proposes a novel, human-designed formula that accurately estimates the price of white wines. The formula integrates variables such as acidity, sulphates, residual sugar, and volatile acidity—each weighted with philosophical and economic significance—into a predictive framework that is both interpretable and intuitive. Unlike conventional black-box regressions, the methodology underscores transparency, causal inference, and domain-sensitive calibration. This work is not only a tribute to the enduring relevance of analytical thinking in a machine age but also a call for more interdisciplinary bridges between oenology and economics, chemistry and computation, palate and price. It aims to empower connoisseurs, traders, and vineyards alike to understand, forecast, and perhaps demystify the economics swirling within every bottle. The findings reveal a striking congruence between predicted and actual price tiers, suggesting that white wine pricing, far from being capricious or arbitrary, often adheres to a hidden logic that this paper attempts to uncover and articulate.
DOI: https://doi.org/10.5281/zenodo.17104035
Comparative Review On Self-Healing Concrete Using Bacteria And Crystalline Admixtures
Authors: Mrs. Vandana Rahul Shah
Abstract: Concrete, though the most widely used construction material, is prone to cracking, which compromises durability, service life, and sustainability of structures. Conventional repair methods are temporary, costly, and labor-intensive. In recent years, self-healing concrete has emerged as a promising alternative, capable of autonomously repairing cracks and enhancing long-term performance. This review focuses on two major self-healing approaches: bacterial concrete, where microorganisms such as Bacillus subtilis precipitate calcium carbonate within cracks, and crystalline admixture-based concrete, where chemical additives react with unhydrated cement particles and moisture to form insoluble crystals. A comparative analysis of past studies indicates that bacterial concrete can effectively heal cracks up to 0.8 mm, providing superior strength and durability improvements, though at higher cost. Crystalline admixtures, on the other hand, are economical, commercially available, and suitable for healing micro-cracks up to 0.5 mm, particularly in water-retaining structures. The paper highlights the mechanisms, advantages, limitations, and applications of both approaches, and identifies future research directions including hybrid systems, large-scale field trials, and cost optimization. Findings suggest that self-healing concrete technologies have significant potential to reduce maintenance needs and promote sustainable infrastructure development.
Public-Private Partnerships: Catalyzing Sustainable Infrastructure and Service Innovation
Authors: Podapala Siva Reddy, Ch. V. Radhika, Gadiraju Parvathi
Abstract: Public-private partnerships (PPPs) have changed over centuries, notably since the Roman Empire, to modern forms of infrastructure production and public service delivery. There has been a resurgence of interest beginning in the late twentieth century in seeing PPPs as a way to engage in infrastructure development via viable financing alternatives and efficient risk sharing. This study examined whether PPPs are effective in both mobilizing private-sector capital for infrastructure development, whether they do so through-efficient risk allocation, and also whether this risk allocation model for service delivery yields improved and responsive public service delivery within the context of infrastructure production. The study considered international examples of implementation, the most salient contractual and other governance characteristics of PPPs, and critically examined factors that impact sustainability of PPPs. The study was mixed-methods and underscored how PPPs can enhance infrastructure production/quality and that future research should focus on sectoral frameworks, the socio-economic implications of PPPs for communities, and advancements of the governance framework for PPPs.
Adaptive Credit Card Fraud Detection Using Machine Learning And Deep Reinforcement Learning
Authors: Sai Rithwik Nooguri
Abstract: Credit card fraud detection is a challenge in the financial sector, where the rarity of fraudulent transactions makes accurate classification particularly difficult. This study presents a comprehensive approach that integrates data preprocessing, resampling techniques, traditional machine learning models, anomaly detection methods, and deep reinforcement learning for effective fraud detection. Initially, extensive exploratory data analysis (EDA) was conducted, followed by handling missing values and applying Synthetic Minority Over-sampling Technique (SMOTE) to address class imbalance. A variety of supervised models, including Logistic Regression, Random Forest, XGBoost, and Multi-Layer Perceptron (MLP), as well as unsupervised anomaly detection methods like Isolation Forest and Local Outlier Factor, were evaluated. Subsequently, a Deep Q-Learning Network (DQN) was implemented to model fraud detection as a sequential decision-making problem, allowing the system to dynamically learn fraud patterns. The experimental results demonstrate that DQN achieved high precision, recall, and F1-score, outperforming several traditional classifiers. This study highlights the importance of combining classical and modern learning paradigms to enhance information assurance in credit card transaction systems. The code supports reproducibility and future research.
DOI: https://doi.org/10.5281/zenodo.17119185
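The SMOTE resampling step named in the abstract above can be sketched with a minimal, self-contained interpolation routine. This is an illustrative approximation on synthetic data, not the authors' code; `smote_like` and its parameters are hypothetical names, and a real pipeline would typically use a library implementation such as imbalanced-learn's `SMOTE`.

```python
# Minimal SMOTE-style oversampling (illustrative; see Chawla et al., 2002).
import numpy as np

def smote_like(X_min, n_new, k=5, seed=0):
    """Interpolate new samples between minority points and their neighbours."""
    rng = np.random.default_rng(seed)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        nbrs = np.argsort(d)[1:k + 1]       # k nearest minority neighbours
        j = rng.choice(nbrs)
        lam = rng.random()                  # interpolation factor in [0, 1)
        out.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(out)

rng = np.random.default_rng(1)
X_min = rng.normal(loc=3.0, size=(20, 4))   # 20 minority-class samples
X_new = smote_like(X_min, n_new=80)
print(X_new.shape)                          # (80, 4)
```

Because each synthetic point lies on a segment between two minority samples, the minority class is enriched without duplicating rows exactly.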
Socio-Economic Factors Affecting Fresh Tomato Marketing In Kitgum Main Market, Uganda
Authors: Denish Ocira, Edward Ssemakula
Abstract: The continuous rise of urbanization has led to an overwhelming increase in waste generation, with serious consequences for the environment and humans. Most waste disposal methods are inefficient, with little accountability or participation from the community; we therefore propose a Smart Waste Management System (SWMS) built on AI technologies that employs computer vision and cloud computing for real-time tracking, facilitating improved waste sorting and complaint handling for upcycling. The system allows community members to upload pictures of items to be reused; an artificial intelligence model identifies their categories and tags each item for appropriate action. The platform also enables tracking of complaints and donations of reusable items, thereby generating data that helps urban waste management authorities make decisions. This paper explains the system design, implementation, and sustainability implications.
DOI: https://doi.org/10.5281/zenodo.17119882
Experimental Study On The Properties Of Concrete Using Marble Powder And Steel Fibres As Partial Cement Replacement
Authors: Deepak Kumar Mishra
Abstract: Concrete, a fundamental material in construction, is increasingly being modified to incorporate sustainable alternatives that enhance performance while minimizing environmental impact. This study investigates the effects of partially replacing cement with marble powder and adding steel fibres in varying proportions (0%, 0.5%, 1%, 1.5%, and 2.0%) on the mechanical properties of M25 grade concrete. Results show that a mix containing 15% marble powder and 1% steel fibre achieves optimal compressive, split tensile, and flexural strength at 28 days. The marble powder improves workability due to its smooth texture and spherical shape, while the addition of steel fibre, though reducing workability, enhances bonding and overall strength. The findings suggest that the combination of marble powder and steel fibres can be effectively used in structural applications such as multistoried buildings and bridges. A recommended optimal mix of 15% marble powder and 1% steel fibre offers the best performance, though further long-term studies are advised to assess durability and field performance.
Water Pollution In Two Canals Across The Ajay River Due To Coal Mining: A Seasonal Analysis
Authors: Dr. Sanjay Kumar Singh, Mr. Sujeet Kumar, Dr. Niranjan Kumar Mandal
Abstract: Coal mining activities significantly contribute to the degradation of water quality, especially in areas close to mining operations. This study examines the water quality in two different canals across the Ajay River, assessing seasonal variations in physicochemical parameters and heavy metal concentrations during the rainy, winter, and summer seasons. Parameters such as pH, turbidity, conductance, hardness, alkalinity, total solids, and concentrations of heavy metals including arsenic, iron, zinc, and others were evaluated. Results indicate that water pollution fluctuates seasonally, with the highest contamination observed in the rainy season. These findings underscore the need for continuous monitoring and effective water management strategies to mitigate the adverse effects of coal mining on water quality.
DOI: https://doi.org/10.5281/zenodo.17120411
Development Of Acylated Pyrazole-Containing Heterocyclic Chalcones: Synthesis, Spectral Studies, And Antibacterial Assessment
Authors: Ranjan Kumar, Niranjan Kumar Mandala, Poonam Kumaria
Abstract: In this study, a novel series of 1-[3-(4-fluoro-3-methylphenyl)-5-phenyl-4,5-dihydro-1H-pyrazol-1-yl]ethan-1-one derivatives (3a-i) were synthesized, and their chemical structures were studied by 1H NMR, IR, and mass spectrometry. TLC was used to examine the isolated products to determine their purity. The results of this study show that these derivatives have interesting properties. The disc diffusion method was used to test the in vitro antimicrobial activity of the synthesized compounds against Escherichia coli (MCC 2412), Staphylococcus aureus (MCC 2408), Bacillus subtilis (MCC 2010), Pseudomonas aeruginosa (MCC 2080), Saccharomyces cerevisiae (MCC 1033), and Candida albicans (MCC 1439).
DOI: https://doi.org/10.5281/zenodo.17120673
Synthesis, Spectroscopic Characterization, And Biological Evaluation Of (2E,3Z)-3-(((E)-2,4-dichlorobenzylidene)hydrazono)butan-2-one Oxime And Its Fe(II), Cu(II), Co(II), Ni(II), And Mn(II) Complexes
Authors: C. S. Azad, Prateek Mohan Mishra, Niranjan Kumar Mandal
Abstract: A novel Schiff base ligand, (2E,3Z)-3-(((E)-2,3-dichlorobenzylidene)hydrazono)butan-2-one oxime, was synthesized and characterized, along with its coordination complexes with Mn(II), Fe(II), Cu(II), Zn(II), Cd(II), Hg(II), Ni(II), and Pd(II) ions. The ligand and its metal complexes were characterized using various spectroscopic techniques, including FTIR, NMR, UV-Vis, and elemental analysis, to confirm their structural integrity and coordination behavior. The antibacterial and antifungal activities of the ligand and its complexes were evaluated against a range of pathogenic bacterial strains (e.g., Escherichia coli, Staphylococcus aureus) and fungal strains (e.g., Candida albicans, Aspergillus niger). The metal complexes exhibited enhanced antimicrobial activity compared to the free ligand, with the Cu(II) and Ni(II) complexes showing particularly potent inhibitory effects. The structure-activity relationship suggests that the coordination of metal ions enhances the lipophilicity and interaction with microbial cell membranes, thereby improving bioactivity. These findings suggest potential applications of these complexes in the development of new antimicrobial agents.
DOI: https://doi.org/10.5281/zenodo.17120985
Synthesis, Characterization, And Bioactivity Evaluation Of (E)-N-(2-Fluoro-6-hydroxybenzylidene)-4-methoxybenzohydrazide And Its Ni(II), Co(II), Cu(II), Pd(II), Mn(II), And Fe(II) Complexes
Authors: Ranjan Kumar, Vishal Kumar, Niranjan Kumar Mandal
Abstract: This study reports the successful synthesis and comprehensive characterization of a novel hydrazone ligand, (E)-N’-(2-fluoro-6-hydroxybenzylidene)-4-methoxybenzohydrazide, along with its coordination complexes with nickel(II), cobalt(II), copper(II), palladium(II), manganese(II), and iron(II) ions. Structural elucidation was achieved through spectroscopic techniques including FT-IR, 1H and 13C NMR, and mass spectrometry, confirming ligand coordination and complex formation. The antibacterial and antifungal activities of the free ligand and its metal complexes were evaluated against a range of pathogenic strains. The complexes exhibited enhanced antimicrobial efficacy relative to the parent ligand, with significant inhibitory zones observed against Gram-positive and Gram-negative bacteria, as well as fungal species. These findings suggest that metal complexation improves bioactivity, indicative of potential applications in therapeutic agent development. This work contributes to the growing field of bioinorganic chemistry by providing insight into structure-activity relationships of hydrazone-metal complexes as promising antimicrobial candidates.
Robotic Arms As Cognitive Tools For Designing Extraterrestrial Architecture
Authors: Azar Djamali
Abstract: In his essay “The Future Isn’t What It Used To Be,” Victor Papanek critiques the prevailing drive to systematise design, arguing that an over-reliance on scientific predictability has led to a critical disconnection from fundamental human sensory responses to natural environmental conditions (Papanek 1995, cited in Margolin and Buchanan 1995). He observes that modern, hermetically sealed interiors—products of post-war development—have subjected inhabitants to a prolonged experiment in artificial living, severing vital connections to atmospheric phenomena like natural light and air. This intellectual foundation establishes an urgent imperative for design: to take conscious responsibility for manufactured environments that support rather than damage human health and performance. Within this critical framework, this paper considers whether robotic arms can serve as tools for thinking, assisting architects in reimagining the architectural design process for extraterrestrial habitats on the Moon and Mars, where creating viable sensory environments constitutes a fundamental prerequisite for survival rather than merely an aesthetic concern. This article envisions a future where architects employ robotic arms as cognitive tools in the design process, transforming creative efforts into an interactive blend of ideas and physical actions. It highlights how these robotic systems can extend human thinking capabilities, enabling architects to visualise and manipulate designs in previously impossible ways. Research synthesized from over 100 papers reveals that robotic arms provide immediate feedback during design processes, allowing architects to explore multiple concepts simultaneously and develop innovative solutions for extraterrestrial habitats. For example, when designing a structure on Mars, architects can use robotic arms to experiment with various materials and configurations, refining ideas in real time. 
A pertinent real-world example is the “Mars Ice Home” concept designed by the firm SEArch+ (Space Exploration Architecture) for NASA. This project exemplifies the principles of habitability and in-situ resource utilisation, proposing a radiation-shielded, pressurised habitat constructed from Martian water-ice. The architects at SEArch+ prioritised the psychological well-being of inhabitants by designing a layered, light-filtering ice shell to create a connection to the external Martian environment, directly addressing Papanek’s critique of sensory-disconnected interiors (SEArch+ 2021). This cognitive collaboration enhances problem-solving capabilities and encourages architects to expand creative boundaries. However, a significant gap remains in understanding how to fully integrate robots as cognitive and creative partners in architecture. Further research is needed to explore human-robot interaction dynamics and optimise these relationships for design processes. By embracing robotic arms as thinking partners, architects can optimise resource utilisation and develop new approaches to architectural challenges, paving the way for advancements in extraterrestrial living.
Long-Term Social And Economic Effects Of AI In Indian FinTech: A Quantitative Survey Approach
Authors: Theertha Prasad K
Abstract: This study investigates the long-term socioeconomic consequences of Artificial Intelligence (AI) adoption in the Indian FinTech sector through a quantitative survey approach. The primary objectives were to assess economic effects such as productivity gains, employment impacts, and cost efficiency; evaluate social outcomes including financial inclusion, consumer trust, digital equity, and user satisfaction; capture stakeholder perceptions of opportunities and risks; and examine whether AI contributes to inclusive growth or reinforces divides. Data were collected from 412 FinTech professionals across India’s major financial hubs—Bengaluru, Mumbai, Hyderabad, and Delhi-NCR—using a structured questionnaire comprising 28 items. Stratified random sampling ensured representation from startups, mid-sized firms, and large enterprises. Responses were analyzed using SPSS v.28, applying descriptive statistics, correlation, and regression analysis. The results revealed strong positive economic impacts, with professionals noting significant gains in productivity, decision-making speed, and profitability, though employment effects were mixed. Socially, AI was perceived to advance financial inclusion and customer satisfaction, while concerns remained regarding digital divides, privacy, and trust. Regression results showed that stakeholder perceptions were the strongest predictor of inclusive growth, highlighting the decisive role of professional views in shaping sustainable AI adoption. The findings contribute to filling the literature gap by quantifying interconnected economic, social, and perceptual outcomes, offering critical implications for policymakers, practitioners, and academics in ensuring AI fosters equitable and inclusive development in India’s FinTech ecosystem.
DOI: http://doi.org/
QR Based Smart Restaurant Ordering
Authors: Dev Tilak, Divyakshi Sharma, Mayuri Umaretiya, Neel Keshruwala
Abstract: Restaurants nowadays require more than technology that improves efficiency—it ought to actively enhance people’s dining experience. To facilitate this, we propose a Smart Restaurant Ordering System developed with the MERN stack, where customers can use special, table-numbered QR codes to instantly view and interact with the electronic menu [1], [2]. It reduces wait time, streamlines the ordering process, and reduces unwanted physical contact [6]. The most accessible of the system’s functions is its allergy filter, which allows consumers to rule out ingredients and dine with confidence, especially the health-conscious or dietary-restricted [3], [5]. This is consistent with a shifting trend in consumer demand towards individualized food choice and increased dietary prominence [7], [11]. In addition to its current capabilities, the platform is designed to be scalable for further development and to incorporate future features like AI-powered food suggestion, personalized dietary guidance, and functional aspects like secure online payment processing and real-time kitchen connectivity [8]–[10], [12]. By prioritizing safety, personalization, and user control, this research sets a robust and scalable platform for future generations of smart dining technologies that not only scale up operations but also deepen customer experience [4]. Index Terms—Smart Restaurant System, QR Code Ordering, Food Recommender System, MERN Stack, Allergy Filter, Secure Payments.
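A sketch of how the allergy filter described above might work, assuming a simple ingredient-list menu representation (the data model and the `filter_menu` name are hypothetical illustrations, not taken from the paper):

```python
# Hypothetical allergy filter: exclude menu items whose ingredients
# intersect the diner's allergen list (case-insensitive).
def filter_menu(menu, allergens):
    a = {x.lower() for x in allergens}
    return [item for item in menu
            if not a & {i.lower() for i in item["ingredients"]}]

menu = [
    {"name": "Peanut Satay", "ingredients": ["peanut", "chicken"]},
    {"name": "Veg Biryani",  "ingredients": ["rice", "vegetables"]},
]
print([m["name"] for m in filter_menu(menu, ["Peanut"])])  # ['Veg Biryani']
```

In a MERN deployment this logic would naturally live server-side, with the allergen list stored against the diner's session started by the QR-code scan.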
Automated Timetable Generator
Authors: Kevan Tamboli, Manmeetsinh Mandora, Heval Shah, Aryan Maheshwari
Abstract: Academic timetable creation is a constraint-rich and repetitive task that is often handled manually, making it slow, resource-intensive, and prone to errors. This paper presents an Automated Timetable Generator designed with a hybrid architecture that combines a React-based web frontend with Tailwind CSS, a Node.js middleware for orchestration and persistence, and a Python (FastAPI) microservice using Pandas/NumPy for tabular processing and Pydantic for model validation. The fully implemented generator ingests four structured spreadsheets (Teachers, Subjects, Rooms, FixedSlots) and produces conflict-free teacher-wise, batch-wise, and room-wise schedules, exportable to CSV/Excel for sharing. The system enforces hard constraints such as preventing teacher or room double-bookings, matching course type to room type, enforcing workload limits, and respecting pre-assigned/fixed slots. It also accommodates soft constraints like teacher subject preferences, shift preferences, and designation-aware fairness. Experiments on departmental-scale datasets show zero hard-constraint violations and sub-second runtimes, indicating that transparent rule-based scheduling can reliably automate institutional timetabling without metaheuristics or specialized solvers.
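As an illustration of the hard-constraint checks described above, a minimal double-booking detector might look like the following. The `(day, slot, teacher, room, batch)` tuple representation is an assumption made for the sketch, not the paper's actual data model:

```python
# Illustrative hard-constraint check: detect teacher/room double-bookings.
from collections import defaultdict

def double_bookings(schedule):
    """Return clashes where a teacher or a room is booked twice in one slot."""
    seen = defaultdict(list)
    clashes = []
    for day, slot, teacher, room, batch in schedule:
        for key in ((day, slot, "teacher", teacher), (day, slot, "room", room)):
            if seen[key]:                   # resource already taken this slot
                clashes.append((key, batch))
            seen[key].append(batch)
    return clashes

sched = [
    ("Mon", 1, "T1", "R101", "B1"),
    ("Mon", 1, "T2", "R102", "B2"),
    ("Mon", 1, "T1", "R103", "B3"),         # T1 double-booked in Mon slot 1
]
print(double_bookings(sched))   # [(('Mon', 1, 'teacher', 'T1'), 'B3')]
```

A rule-based generator of the kind the abstract describes would run checks like this after every tentative assignment and simply reject any placement that produces a clash.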
Reinforcing Monitoring System For River Using ML
Authors: K Lakshma Reddy, Srinath Khemkar, T Yogeshwar, Tammali Naresh
Abstract: Floods are a major natural disaster that can cause widespread damage and loss of life. Machine learning (ML) is a powerful tool that can be used to improve the accuracy of flood forecasting and warning systems. This research paper presents a novel ML-based flood forecasting and warning system. The proposed system uses a combination of different ML algorithms to predict the likelihood and severity of flooding. The algorithms are trained on historical data on rainfall, river levels, and other factors that can contribute to flooding, and the system can also take into account real-time data, such as current rainfall and river levels. Designing a flood detection system using ML involves utilizing historical data to train models that can identify patterns indicative of flooding, and a proposed system architecture for flood detection is presented. The system was evaluated using a dataset of historical flood events; the results showed that it was able to accurately predict the likelihood and severity of flooding and to generate timely warnings, which can help to save lives and reduce property damage. The proposed system has the potential to be used in a variety of settings, including river basins, urban areas, and coastal regions, and it is scalable and easily adapted to new locations. The paper concludes with a discussion of the challenges and limitations of the proposed system and of the potential for future research in this area.
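A toy sketch of the kind of supervised flood classifier such a system might include, trained on synthetic rainfall and river-level features (the data, the linear labelling rule, and the choice of logistic regression are illustrative assumptions, not the paper's model):

```python
# Toy flood-risk classifier on synthetic features (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
rain = rng.gamma(2.0, 20.0, 1000)        # daily rainfall, mm (synthetic)
level = rng.normal(3.0, 1.0, 1000)       # river level, m (synthetic)
X = np.column_stack([rain, level])
# Hypothetical labelling rule standing in for historical flood records
y = (0.02 * rain + 0.8 * level > 4.0).astype(int)

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict([[90.0, 4.2], [10.0, 2.5]]))   # high-risk vs low-risk day
```

A deployed system would replace the synthetic arrays with gauge records and re-score the model as real-time rainfall and river-level readings arrive.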
Evaluating The Impact Of Rainwater Harvesting Systems For Sustainability In Multinational Organizations
Authors: Dr.Reshma Nair, Mr.Protyush De
Abstract: Water scarcity has become a pressing global issue in its various aspects. Multinational corporations waste an abundant amount of water every day. Therefore, adopting sustainable practices—rainwater harvesting, purification followed by reuse, and generation of micro power to run mini devices such as wireless charging systems—can be of great help in saving water and applying hydrological power to conserve nonrenewable resources. This research explores the innovative application of bio-mimicry in rainwater harvesting, purification systems, and the installation of mini turbines as a pathway for multinationals to reduce water dependency, promote environmental sustainability, and enhance operational resilience. Bio-mimicry, a design approach that draws inspiration from natural processes and organisms, provides a robust framework for optimizing water capture, storage, filtration, and power generation. Key measures discussed include the use of lotus leaf-inspired hydrophobic surfaces to maximize water collection, Baobab tree-inspired modular storage systems to sustain water availability, mangrove-root filtration techniques to enhance purification, and the installation of mini turbines to generate electricity. Additionally, the study examines fog and dew harvesting methods along with harvesting of water dripping from the coolant pipes of air-conditioner condensers. By implementing these nature-inspired systems and a closed-loop automated turbine, multinationals can significantly lower water consumption, promote aquifer recharge, and generate a modest amount of power to reduce pollution, aligning with their environmental, social, and governance (ESG) commitments.
This research underscores the potential of bio-mimicry to reshape sustainable water practices within corporate structures and of mini turbines to generate electricity, providing a scalable model for coherent, eco-philic water management in diverse climates and operational contexts.
Advanced Energy Management And Power Quality Enhancement In DC Micro-grids With EV Fast Charging Using ANN-Controlled STATCOM
Authors: Hachimenum Nyebuchi Amadi, Henry Okechukwu Williams, Richeal Chinaeche Ijeoma
Abstract: The rapid integration of electric vehicle (EV) fast charging stations in DC micro-grids has introduced significant power quality challenges, particularly harmonic current distortion at the point of common coupling (PCC). In this study, a DC microgrid integrating photovoltaic (PV) generation, battery energy storage systems (BESS), and a Level-3 EV fast charging station was modeled in MATLAB/Simulink to examine the effect of harmonic distortion and evaluate mitigation using an Artificial Neural Network (ANN)-controlled Static Synchronous Compensator (STATCOM). Base case simulation results revealed that the EV fast charging station injected excessive harmonic distortion into the network, with dominant odd harmonics at the 11th and 13th orders, leading to a total harmonic distortion (THD) of 14.05%. This value significantly exceeds the IEEE 519-2022 standard limit of 8% for medium-voltage systems. Following the installation of an ANN-tuned STATCOM at the PCC, the harmonic distortion was substantially mitigated, reducing the 11th and 13th orders to 0.01% and 0.15% respectively. Consequently, the total harmonic distortion was minimized to 1.23%, achieving a 91.24% reduction and ensuring full compliance with IEEE standards. Furthermore, the ANN controller demonstrated excellent training performance with a best validation mean square error of 0.0034611 at epoch 20 and a regression correlation coefficient of R = 0.9879, validating its accuracy and robustness. These findings confirm that ANN-controlled STATCOM provides an effective and intelligent solution for enhancing power quality and system stability in DC micro-grids with EV fast charging integration.
DOI: http://doi.org/
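The total harmonic distortion (THD) metric central to the abstract above can be illustrated by computing THD from a sampled waveform. The sketch below uses a synthetic current containing 11th- and 13th-order harmonics of arbitrarily chosen amplitude; it is not the MATLAB/Simulink model from the study:

```python
# Illustrative THD computation from a sampled current waveform (synthetic).
import numpy as np

fs, f0, n = 10_000, 50, 2000            # 0.2 s window -> exact 5 Hz FFT bins
t = np.arange(n) / fs
# Fundamental plus 11th and 13th harmonics (amplitudes chosen arbitrarily)
i = (np.sin(2 * np.pi * f0 * t)
     + 0.10 * np.sin(2 * np.pi * 11 * f0 * t)
     + 0.09 * np.sin(2 * np.pi * 13 * f0 * t))

spec = np.abs(np.fft.rfft(i)) / n       # half-amplitude spectrum
freqs = np.fft.rfftfreq(n, 1 / fs)
fund = spec[np.argmin(np.abs(freqs - f0))]
harm = [spec[np.argmin(np.abs(freqs - k * f0))] for k in range(2, 40)]
thd = np.sqrt(sum(h * h for h in harm)) / fund
print(f"THD = {100 * thd:.1f} %")       # sqrt(0.10^2 + 0.09^2) ~ 13.5 %
```

Choosing the window length so every harmonic falls on an exact FFT bin avoids spectral leakage; in a simulation or measurement workflow the same ratio of harmonic RMS to fundamental gives the THD figures reported above.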
Advanced Power Factor Correction for Distribution Efficiency Enhancement: The Case of Port Harcourt Mainstream 33 kV Distribution Network
Authors: Hachimenum Nyebuchi Amadi, Happy Prince Nwokoegi, Richeal Chinaeche Ijeoma
Abstract: Distribution efficiency in developing power systems is often undermined by excessive reactive power demand, poor voltage regulation, and high technical losses. The Port Harcourt mainstream 33 kV distribution network in Nigeria, a critical urban supply corridor, is particularly vulnerable to these inefficiencies due to its radial structure, overloaded transformers, and weak reactive power support. Such conditions result in low power factor, under voltage problems, and distribution losses that exceed international performance standards, thereby threatening supply reliability and quality of service. In this study, the Port Harcourt 33 kV distribution network was modeled in MATLAB/Simulink to evaluate its operational performance and investigate the effectiveness of advanced power factor correction (PFC) using a Distribution Static Synchronous Compensator (D-STATCOM). Baseline simulations revealed progressive voltage deterioration along the feeder, with the weakest bus falling to 0.910 p.u., well below the operational limit of 0.95 p.u. Furthermore, the system recorded active power losses of 1.604 MW, equivalent to 9.5% of peak demand, substantially higher than the 2–6% technical loss benchmark recommended by IEEE for efficient distribution systems. Following the integration of D-STATCOM into the network, remarkable improvements were observed. All bus voltages were restored within 0.989-0.999 p.u., with the weakest bus improved from 0.910 p.u. to 0.989 p.u., thereby ensuring compliance with the 0.95-1.05 p.u. standard. In addition, total technical losses decreased sharply to 0.283 MW, equivalent to 2.0% of peak demand, placing the network well within international best-practice thresholds. The findings confirm D-STATCOM as an effective and sustainable solution for improving voltage stability, minimizing technical losses, and enhancing reliability in urban distribution networks.
Moldex3D-Based Simulation in Injection Molding: A Review of Flow, Cooling, Warpage, and Defect Prediction
Authors: Shani Singh
Abstract: Injection molding remains the dominant process for high-volume plastic production, but its strong sensitivity to process parameters, material behaviour, and cooling efficiency makes traditional trial-and-error optimization costly and slow. Moldex3D, a dedicated 3D CAE package for polymer processing, enables detailed simulation of filling, packing, cooling, and warpage, allowing defects such as short shot, weld lines, air traps, sink marks, voids, and deformation to be predicted before tool manufacture. By combining non-Newtonian flow models, temperature-dependent material data, and realistic mold and cooling layouts, Moldex3D helps designers and process engineers optimize gate locations, runner balance, packing profiles, and conventional or conformal cooling channel designs. This review consolidates recent Moldex3D-based research across automotive, consumer, electronic, and medical applications, with emphasis on flow and shrinkage analysis, warpage prediction in fibre-reinforced parts, and use in multi-cavity and thin-walled molds. Advanced and hybrid workflows are also examined, including integration with CAD/CAE platforms, topology and DOE-based optimization, and transfer of fibre orientation and residual stress fields into structural FEA. Key limitations are identified in material modeling (PVT, viscosity, fibre orientation), mesh and computation cost for full 3D models, and incomplete coupling to structural durability analysis and real machine behaviour. Finally, the review highlights future opportunities for AI-assisted optimization, cloud-based simulation, and digital twin integration, positioning Moldex3D as a core enabler of simulation-driven, intelligent injection molding.
DOI: https://doi.org/10.5281/zenodo.18494341
Using Machine Learning For Cross-Crop Nitrogen Deficiency Detection In Crops
Authors: Ravi Prakash Jaiswal, Manish Saraf, Vijendra Pratap Singh, Ambuj Kumar Misra
Abstract: Nitrogen (N) deficiency remains a major constraint on cereal productivity because it reduces chlorophyll formation, canopy photosynthesis, and grain filling, while blanket fertilizer practices often fail to match within-field variability and reduce nitrogen use efficiency (NUE) (Govindasamy et al., 2023). Although destructive sampling and laboratory diagnostics are accurate, they are slow and difficult to scale for timely, spatially targeted decisions in real farms (Fu et al., 2021). This study frames N deficiency detection as a cross-domain transfer learning problem and develops a cross-crop machine learning framework for wheat, maize, and rice using RGB imagery under field conditions. We harmonized and profiled three public datasets (wheat: 1,381 leaf images; maize: 1,200 canopy/plot images; rice: 1,500 leaf images with Leaf Color Chart-based labeling), applied standardized preprocessing, and trained baseline CNN and fine-tuned ResNet models with fixed random seeds and identical train/validation/test splits for reproducibility. Performance was evaluated under three scenarios: within-crop testing, direct cross-crop transfer without retraining, and domain adaptation using unlabeled target data. Four adaptation methods were benchmarked: CORAL, MMD, AdaBN, and Domain-Adversarial Neural Networks (DANN) (Ganin et al., 2016; Gretton et al., 2012; Li et al., 2016; Sun & Saenko, 2016). Baseline cross-crop transfer showed substantial generalization gaps (≈25–35 percentage points), with accuracy ranging from 47.6% to 56.2% across crop pairs, confirming severe domain shift (Fu et al., 2021; Pan & Yang, 2010). Domain adaptation improved average cross-crop accuracy from 51.7% (baseline) to 58.3% (AdaBN), 60.1% (CORAL), 64.6% (MMD), and 73.2% (DANN), with DANN delivering up to ~19% absolute improvement and the most consistent gains under challenging transfers (Ganin et al., 2016). 
Overall, results indicate that adversarial domain adaptation can substantially reduce cross-crop failure modes and supports more scalable nitrogen monitoring with reduced dependence on crop-specific labels, while practical deployment should include agronomic guardrails and uncertainty-aware decision rules for safe in-season recommendations (Fu et al., 2021).
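Of the adaptation methods benchmarked above, CORAL (Sun & Saenko, 2016) is simple enough to sketch directly: source features are whitened and then re-coloured with the target covariance. The data below is synthetic; the crop datasets and model heads are not reproduced:

```python
# Sketch of CORAL feature alignment on synthetic "source"/"target" features.
import numpy as np

def coral(Xs, Xt, eps=1e-3):
    """Whiten source features, then re-colour with the target covariance."""
    def cov(X):
        return np.cov(X, rowvar=False) + eps * np.eye(X.shape[1])
    def mat_pow(C, p):                   # symmetric matrix power via eigh
        w, V = np.linalg.eigh(C)
        return (V * w ** p) @ V.T
    Xc = Xs - Xs.mean(axis=0)
    return Xc @ mat_pow(cov(Xs), -0.5) @ mat_pow(cov(Xt), 0.5) + Xt.mean(axis=0)

rng = np.random.default_rng(0)
Xs = rng.normal(size=(500, 6)) @ rng.normal(size=(6, 6))        # "source crop"
Xt = rng.normal(size=(500, 6)) @ rng.normal(size=(6, 6)) + 2.0  # "target crop"
Xa = coral(Xs, Xt)
# After alignment the source covariance approximates the target covariance.
gap_before = np.linalg.norm(np.cov(Xs, rowvar=False) - np.cov(Xt, rowvar=False))
gap_after = np.linalg.norm(np.cov(Xa, rowvar=False) - np.cov(Xt, rowvar=False))
print(gap_after < gap_before)            # True
```

CORAL only matches second-order statistics, which is why adversarial methods like DANN, which learn domain-invariant representations end to end, delivered the larger gains reported above.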