
Daily Archives: July 8, 2025

Uncategorized

IJSRET Editorial Board Member Lakshmi Kalyani Chinthala

 

Lakshmi Kalyani Chinthala

Affiliation:

Strategic Planning Program Manager

San Francisco, California

Email-Id: chinthalakalyani01@gmail.com
Publication:

  • Chinthala, L. K. (2021). Future of supply chains: Trends in automation, globalization, and sustainability. International Journal of Scientific Research & Engineering Trends, 7(6), 1-10.
  • Chinthala, L. K. (2021). Diversity and inclusion: The business case for building more equitable organizations. Journal of Management and Science, 11(4), 85-87.
  • Chinthala, L. K. (2021). Business in the Metaverse: Exploring the future of virtual reality and digital interaction. International Journal of Science, Engineering and Technology, 9(6). ISSN (Online): 2348-4098.
  • Chinthala, L. K. (2025). Consumer experience 2025: The role of personalization and AI in shaping business strategies. International Journal of Modern Science and Research Technology, 3(5), 21–27.
  • Chinthala, L. K. (2025, February). Artificial intelligence in business strategy: Enhancing competitive advantage through machine learning. International Journal of Modern Science and Research Technology, 3(2), 83–90.

 

Published by:
Uncategorized

Intelligent Detection of Mobile SMS Spam via Machine Learning and Deep Learning

Authors: Megha Birthare, Neelesh Jain

Abstract: The global rise in mobile and social media usage has led to a surge in unwanted bulk SMS, necessitating an effective system to filter out these messages. Spam text messages are among the most prevalent nuisances on the internet: sending a spam-filled SMS is a straightforward task for spammers, who can also steal valuable data, including contacts and files, from our devices. In recent years, several word embedding techniques leveraging deep learning have been developed, and these advances in word representation offer a promising remedy for these problems. This study examines a technique that employs natural language processing to distinguish between spam and ham texts using the SMS Spam Collection Dataset from the UCI Machine Learning Repository. We compared the accuracy and outcomes of Bi-LSTM and LSTM models. Performance on the dataset is assessed using measures such as F1-score, recall, and accuracy. The study demonstrates that overall accuracy on the dataset increases when Bi-LSTM classification is used. All work is implemented in Python using a Jupyter notebook.
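The F1-score, recall, and accuracy measures the abstract mentions all derive from the same confusion-matrix counts. A minimal sketch of that evaluation step, with illustrative 0/1 labels (1 = spam, 0 = ham) that are not taken from the SMS Spam Collection Dataset:

```python
def classification_metrics(y_true, y_pred):
    """Compute accuracy, recall, and F1 from two equal-length 0/1 lists."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return accuracy, recall, f1

# Hypothetical ground truth vs. a classifier's predictions.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
acc, rec, f1 = classification_metrics(y_true, y_pred)
print(acc, rec, f1)
```

The same counts would be accumulated over the test split of the real dataset when comparing the LSTM and Bi-LSTM models.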

DOI: https://doi.org/10.61137/ijsret.vol.11.issue4.114


Enhancing Student Performance Prediction Using Random Forest and Feature Engineering Algorithms in Machine Learning

Authors: Ms. T. Nandhini (Supervisor, Assistant Professor), M. Mugesh Kumar, M. Snekan, R. Tamilselvan

Abstract: This project presents a machine learning-based methodology for student performance prediction using Random Forest and feature engineering. Academic institutions increasingly depend on data-driven intelligence to enhance educational planning and student support. Conventional models usually fail to capture diverse student characteristics such as demographic, academic, and behavioral features, which restricts their predictive capability. In this paper, we overcome these limitations by proposing an ensemble learning approach with sophisticated feature engineering to enhance the interpretability and flexibility of the prediction process. The Random Forest classifier is employed for its high accuracy and stability, and the model is assessed using metrics such as accuracy, precision, recall, and F1-score. Experimental results indicate that the proposed system surpasses conventional AFSA-based models in detecting at-risk students, allowing for early intervention approaches that improve academic performance.
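A minimal sketch of the approach described above, assuming scikit-learn is available; the feature names, synthetic records, and the derived "engagement" feature are hypothetical stand-ins for the real demographic, academic, and behavioral data, not the paper's dataset:

```python
import random

from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split

random.seed(0)

# Synthetic students: [attendance %, avg test score, study hours/week].
X, y = [], []
for _ in range(200):
    attendance = random.uniform(40, 100)
    score = random.uniform(20, 95)
    hours = random.uniform(0, 30)
    # Feature engineering example: a derived interaction feature.
    engagement = attendance * hours / 100.0
    X.append([attendance, score, hours, engagement])
    y.append(1 if score > 50 and attendance > 60 else 0)  # 1 = pass

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
pred = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, pred))
print("F1:", f1_score(y_test, pred))
```

On real data the same pipeline would be scored with precision and recall as well, as the abstract describes.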

DOI:

 


Design of Flexible Pavement with Maximum Utilization of Industrial Waste

Authors: G. Srikanth, Assistant Professor M. Harish Kumar

Abstract: Since the subgrade is the bedrock upon which the pavement model is built, its behaviour is crucial to the design of flexible road projects, and pavement performance and design analysis therefore demand considerable attention. Expansive soils contain a high concentration of clay and silt, so the soil subgrade must be stabilised or compacted prior to building a flexible pavement; subgrade soil replacement is the method of choice for stabilising certain types of soils. Fly ash, an industrial waste product, is currently utilised at only around 55% of its total capacity. Replacing part of the subgrade with fly ash at varying percentages and layering bitumen-coated chicken mesh appropriately makes good use of this fly ash, and the mesh serves as extra reinforcement that improves the California Bearing Ratio (CBR). In this study, fly ash samples were compacted in the CBR mould to their maximum dry density and optimum moisture content, and then tested with and without layers of bitumen-coated bamboo mesh and chicken mesh. The plan dimensions of the chicken mesh match those of the CBR mould, and samples were prepared with 1, 2, 3, and 4 mesh layers. The laboratory tests show that the mixture with four layers of bitumen-coated chicken mesh and a 15% substitution of fly ash has the highest CBR value, indicating that it is very strong. The soil's moisture content is further lowered by using plastic waste as a partial replacement: various percentages of plastic waste were introduced to the soil in order to quantify and compare the values of UCS, CBR, MDD, and OMC against the unmodified soil.
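The CBR values compared above come from a simple ratio: the load sustained at a standard penetration divided by a standard load, in percent. A hedged sketch, assuming the IS 2720-style standard loads of 1370 kgf at 2.5 mm and 2055 kgf at 5.0 mm penetration; the sample loads below are hypothetical, not the study's measurements:

```python
# Standard loads (kgf) for the two standard penetration depths (mm).
STANDARD_LOAD_KGF = {2.5: 1370.0, 5.0: 2055.0}

def cbr_percent(test_load_kgf, penetration_mm):
    """California Bearing Ratio: test load over standard load, in percent."""
    return 100.0 * test_load_kgf / STANDARD_LOAD_KGF[penetration_mm]

# Hypothetical loads: a plain subgrade sample vs. a reinforced mix.
print(cbr_percent(82.2, 2.5))
print(cbr_percent(205.5, 2.5))
```

A higher measured load at the same penetration, as the mesh-reinforced samples showed, translates directly into a higher CBR.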


Performance Analysis of Two Potential Industrial Waste Materials: Fly Ash & Lime Sludge

Authors: B. Lavanya, Assistant Professor M. Harish Kumar

Abstract: This study stabilized two potentially hazardous industrial waste materials, Fly Ash (FA) and Lime Sludge (LS), using Commercial Lime (CL) and Gypsum (G). FA and LS are generated in large quantities by thermal power plants and water treatment plants, and they pose environmental hazards. The goal was to make the sludge and FA suitable for use in civil engineering construction applications. Tests for Unconfined Compressive Strength (UCS), California Bearing Ratio (CBR), and Split Tensile Strength (STS) were conducted on 39 different combinations of FA, LS, CL, and G. Using these metrics, two composites stabilized with 12% CL and 1% G were found to perform admirably: optimum mix 1 (95% FA + 5% LS) and optimum mix 2 (50% FA + 50% LS). After 28 days of curing, the UCS was 6.6 MPa for optimum mix 1 and 5.8 MPa for optimum mix 2, while the STS was 1.3 MPa and 1.1 MPa, respectively. After 28 days of curing, optimum mix 1 had a soaked CBR value of 75% and an unsoaked value of 89%; for optimum mix 2, the CBR values were 91% unsoaked and 82% soaked. When tested for durability according to both British and American standards, both composites comfortably met the criteria: following twelve cycles of wetting and drying, mix 2 samples lost 1.05% of their initial weight and mix 1 samples lost 1.12%, both well within the recommended limits. After 28 days of curing, the composites also showed negligible leaching of heavy metals. Additionally, scanning electron microscopy (SEM) and X-ray diffraction (XRD) were used to study the development of cementitious compounds during curing. The stabilized composites, however, exhibited brittleness, so fibers were added to these composites and the resulting ductility and strength were examined.
Trials with various fiber percentages showed that adding 0.3% fiber to both composites increased the mix's strength and ductility. The deformability index and strength index were used to measure the improvement in ductility and strength, respectively: adding fibers produced an 80% increase in failure strain and a 40% increase in strength. The findings indicated that the composites achieved the strength, durability, and ductility necessary for use as base course layer materials in flexible pavements. Furthermore, it is recommended that the total pavement thickness be decided upon based on reliability, taking into account the uncertainty in input data such as the design traffic load (measured in Million Standard Axles, MSA) and soil bearing capacity (measured as CBR value). Design charts were produced to determine the total thickness of flexible pavement that can guarantee a specified degree of reliability in the pavement's performance, accounting for the uncertainty in the input data. In addition, a reliability-based technique, which combines the Mechanistic-Empirical approach with Monte Carlo Simulation (MCS) and the First Order Reliability Method, was used to study the uncertainties in distress analysis of flexible pavement, specifically with regard to fatigue and rutting failure. To assess the pavement's performance, the reliability index (β) was calculated, and strains at critical locations for fatigue and rutting failure were computed using PLAXIS 2D software on a three-layered flexible pavement model. The specified strength, durability, and ductility requirements are met by the produced composites, namely Optimal Mix 1 [(95% FA + 5% LS), 12% CL, 1% G, and 0.3% F] and Optimal Mix 2 [(50% FA + 50% LS), 12% CL, 1% G, and 0.3% F].
For a 450 mm thick base of optimal mix 2, the compressive strain at the critical point was found to be εc = 13.42 × 10⁻⁵ and the tensile strain at the same location εt = 18.11 × 10⁻⁵. For a design traffic load of 10 MSA, the pavement performed admirably (βR and βF > 5) according to the probability-based methodology; under larger traffic loads, it still exhibited above-average performance in terms of fatigue and rutting.
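The Monte Carlo side of the reliability analysis described above can be sketched as a capacity-versus-demand simulation, with β estimated from the mean and standard deviation of the safety margin. The normal strain distributions below are hypothetical placeholders, not the study's PLAXIS outputs; only the demand mean loosely echoes the quoted εt = 18.11 × 10⁻⁵:

```python
import math
import random

random.seed(1)

def reliability_index(capacity_mu, capacity_sigma, demand_mu, demand_sigma,
                      n=100_000):
    """Estimate beta = mean(margin) / std(margin) for margin = C - D via MCS."""
    margins = []
    for _ in range(n):
        c = random.gauss(capacity_mu, capacity_sigma)   # allowable strain
        d = random.gauss(demand_mu, demand_sigma)       # computed strain
        margins.append(c - d)
    mu = sum(margins) / n
    var = sum((m - mu) ** 2 for m in margins) / (n - 1)
    return mu / math.sqrt(var)

# Strains in units of 1e-5; all four parameters are assumed values.
beta = reliability_index(capacity_mu=40.0, capacity_sigma=3.0,
                         demand_mu=18.11, demand_sigma=2.5)
print("beta ~", round(beta, 2))
```

With the assumed parameters the estimate comes out above 5, consistent in spirit with the βR and βF > 5 result reported for the 10 MSA load case.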


Analysis on Modelling of Traffic Pollution to Biomonitor Traffic Police

Authors: MD. Afroz, Assistant Professor B. Narsimha

Abstract: In Indian cities, air pollution is a major problem that affects both the environment and people's health. The traffic, congestion, and pollution levels of Mumbai, one of the most densely populated metropolises in the world, have been rising steadily in recent years. The city's top 20 locations are well known for extreme congestion; these areas are called "hotspots" because of the large concentration of vehicles at certain intersections around the city. During the morning peak hour (7:30 to 9:00 am) and the evening peak hour (6:30 to 7:30 pm), vehicles travel at average speeds of 10 km/h and 8 km/h, respectively, and records show that evenings are the most polluted and crowded times of day. Roadway congestion exposes the general population to air and noise pollution. Recorded data show that the noise levels specified by the Central Pollution Control Board are exceeded at the intersections: levels average over 90 dB, even though they should be below 75 dB for industrial settings, 65 dB for commercial settings, 55 dB for residential areas, and 50 dB for quiet zones. Vehicle emissions include a wide range of contaminants, including suspended particulate matter (SPM), oxides of sulphur (SOx), PAN, and nitrogen oxides (NOx); SPM in particular contributes to illnesses of the respiratory system. This research considers RSPM 2.5 (respirable suspended particulate matter). At 20 different hotspots, noise levels, RSPM 2.5, vehicle counts, and congestion were assessed at different times of the day and across seasons; these parameters were averaged and their standard deviations computed. In addition, CALRoads View software is used for data analysis and modeling: the CALINE 3 model predicts pollutant concentrations near intersections where vehicles idle while waiting at signals, and it was useful for determining how pollutants were distributed at the intersections.
Surveys were conducted at congestion areas and intersections via a questionnaire open to retailers, street vendors, and traffic police. Breathing and respiratory issues affected the majority of those who participated; many had been diagnosed with asthma at some point, and the exposed individuals showed signs of both acute and chronic illness, including respiratory conditions such as sinusitis, tonsillitis, bronchitis, and pneumonia. Traffic police are considered the best demographic to investigate because of their heightened exposure to the effects of vehicle emissions on human systems, particularly the respiratory system, as well as hearing impairment. Traffic officials performing administrative tasks served as the control group, while traffic officials stationed at different intersections served as the exposed group. Individuals working in traffic were tested for noise-induced hearing loss (NIHL), and the results clearly show that traffic police have a greater risk of hearing loss: notably, 57.14% of traffic police who have been on the job for more than 25 years have suffered serious NIHL. The investigation found no cases of genetic or familial deafness or ear disease among the participants. The prevalence of asthma was also assessed among traffic staff using spirometry analysis: the maximum expiratory flow rate in litres per minute was measured with the MINISPIR (S/N TO 1795), and FVC was recorded for traffic police officers aged 18–58 years. The diagnosed traffic professionals span a wide range of ages and exposure years. The PEFR was measured in both the exposed and control groups of traffic police.
Traffic workers are directly exposed to particulate matter for approximately eight hours of each workday, and diagnoses span traffic police with less than five years to over twenty-five years of experience. Traffic police who were active smokers were excluded from the analysis. Both the exposed and control groups were tested for disease: more than 35% more people in the exposed group have asthma than in the control group, and the data also show an increasing number of asthma cases. The effects of air pollution on traffic workers can be mitigated by developing and testing protective gear; surgical masks and handkerchiefs are examples of compensatory measures people already take in response to pollution threats to lower their actual exposure. This research set out to develop a study mask capable of preventing exposure to potentially harmful particles such as dust, allergens, and infectious aerosols, with further features gradually added to minimise noise and to shield the eyes from dust particles. Masks were created and tested stage-wise, combining features such as a nose piece with inhaler, an exhaust fan near the mouth or nose, light weight, ease of wearing, adaptability to all face sizes and shapes, low maintenance, and a pleasing appearance. To determine how effective the study mask was, it was compared with other masks on the market and tested thoroughly, since the protection offered by existing face masks is poorly characterised. The purpose was to find out how well face masks protect the respiratory system against potentially hazardous RSPM levels. After subtracting the SPM particles from the total air volume inhaled in one minute, the readings on the muslin fabric are reported; additional issues caused by RSPM are mitigated to a certain degree, and the average mass of particles collected in one minute is greater than 0.1 g/min.
The mask shields the wearer from harmful RSPM levels and noise, and its exhaust fan draws exhaled carbon dioxide away from the mouth to keep the wearer well ventilated, avoiding the usual problems of sweating or itching while wearing a mask. Stages 1 and 2 of the mask's modification process were carried out using 3D printing technology. The mask underwent both quantitative and qualitative testing in accordance with OSHA regulations, and test methods were prepared for submission to the ethics committee. Overall, the outcomes are satisfactory; with further improvements such as size variations and compatibility with additional safety gear like helmets, the device can be released for public use.
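One detail worth making explicit for the noise measurements discussed above: decibel readings cannot be averaged arithmetically, because the scale is logarithmic. The conventional equivalent continuous level Leq uses an energy mean. A small sketch with hypothetical hotspot readings (the study's actual samples are not reproduced here):

```python
import math

def leq(levels_db):
    """Energy-average (equivalent continuous level) of sound levels in dB."""
    energy = sum(10 ** (l / 10.0) for l in levels_db) / len(levels_db)
    return 10.0 * math.log10(energy)

readings = [88.0, 92.0, 95.0, 90.0]  # hypothetical peak-hour samples, dB
print(round(leq(readings), 1))
```

The energy mean is pulled toward the loudest samples, which is why a few 95 dB events dominate an intersection's average even when quieter readings are common.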


Analysis on Knowledge Based Prediction System for Road Traffic Congestion

Authors: S. Likhitha Priya, Assistant Professor B. Narsimha

Abstract: Transportation networks, economies, and quality of life in cities throughout the world are all negatively impacted by the pervasive problem of traffic congestion. Congestion relief is of critical importance in India, where rapid urbanization and population growth are making the situation worse. By integrating user and expert perspectives and using a data-driven approach to prepare a knowledge base for traffic congestion, this thesis presents a holistic strategy to address road traffic congestion, tailored to the Indian scenario in the context of inter-urban highways. One module of the thesis is based on a user survey methodology, while the other centres on applying deep learning techniques to analyse the dynamics of congestion. The thesis objectives cover several facets of congestion prediction. Through observation and feedback, it seeks to understand the causes that contribute to congestion and to identify its characteristics from the perspective of travellers. To achieve this goal, data were analysed from thirteen separate sites along an undivided inter-urban highway, selected for their mixed land-use pattern and known congestion problems. The survey questionnaire yielded 206 responses, which constitute the data set. The Mann-Whitney U test was used to check for heterogeneity between the two traveller groups, drivers and passengers. This objective assesses the features of congestion using the Relative Importance Index (RII) and also scores travellers' preferences for measures to alleviate it, in addition to identifying the attributes of congestion. It further suggests legislative recommendations based on travellers' opinions in an effort to enhance their travelling experience on inter-urban highways.
The second aim of the thesis applies exploratory factor analysis to the characteristics derived from the first objective, together with empirical knowledge, to investigate the interrelationships among the congestion factors. This objective sampled data from 282 responses and 5 experts; these professionals come from a variety of backgrounds in the transportation industry. After conducting exploratory factor analysis on the traffic congestion attributes identified in objective one, and using interpretive knowledge from the experts, the fuzzy-based total interpretive structural modeling technique (Fuzzy TISM) was used to model the interrelationships between the factors. This technique provides both direct and indirect cause-and-effect links between the factors. MICMAC analysis was then used to determine the independent, dependent, linkage, and autonomous components in the connection diagram derived via Fuzzy TISM. Eight important factors contributing to traffic jams on inter-urban highways were identified through exploratory factor analysis: inadequate road geometry (C1), environmental factors (C2), external events (C3), archaic traffic management (C4), inefficient public transportation (C5), driving behavior (C6), special events (C7), and regional economic dynamics (C8). Using the Fuzzy TISM technique, the variables are partitioned into different levels of the hierarchical connection diagram: Level 1 comprises C3, C4, and C6; Level 2 is C5; Level 3 comprises C1 and C7; and Level 4 comprises C2 and C8. Based on the results of the fuzzy MICMAC analysis, factors such as regional economic dynamics (C8), special events (C7), and environmental factors (C2) are independent variables that influence most of the other factors despite having less driving power; the occurrence of these variables also affects the linkage factor.
In this study, archaic traffic management (C4) is a linkage factor with both high driving power and high dependence power owing to its unstable nature; it affects and is affected by other factors such as road geometry dynamics (C3), logistical events (C4), inefficient public transportation facilities (C5), and driving behavior (C6). The first two objectives complete the thesis's first module; objectives three and four constitute its second module. The third objective automates the process of traffic volume determination using modern deep learning detectors and trackers, enabling the collection of real-time data for congestion prediction. This objective was tested on a bespoke dataset consisting of 10,000 virtual reality images from India, depicting a variety of highway situations with multiple vehicles. A model for vehicle categorization and traffic volume counting was developed and trained using YOLO variants, a single-stage object detection methodology, together with the Byte tracker as the counting method. The traffic volume counter was evaluated on intersection scenes and mid-blocks of the inter-urban highways using videos ranging from one to five minutes. The short videos achieve an accuracy of 98%, while the long videos reach an accuracy of up to 87%. To build a comprehensive knowledge base for predicting congestion states, the fourth objective proposes a framework for a knowledge-based prediction system for road congestion. This framework integrates the results of objectives one and three by combining the impedance factors identified in objective one with the traffic flow variables obtained from objective three. The 84-hour data sample includes both weekdays and weekends, and this objective was executed on that basis.
The ANFIS (Adaptive Network-based Fuzzy Inference System) technique, based on several membership functions, was used to analyse and generate the if-then rules for decision makers. It demonstrates that the existence of multiple impedance factors is responsible for the differences in congestion levels between weekdays and weekends. Finally, to improve the efficacy and precision of congestion prediction, the proposed knowledge-based road traffic system makes use of both user-driven insights and expert knowledge. By combining user feedback with expert analysis and automated data collection techniques, the system provides a comprehensive view of traffic congestion and how to predict it. In the Indian context, where varied traffic situations and intricate urban landscapes call for precise prediction systems, this comprehensive methodology is of utmost importance. The system produces strong and trustworthy predictions because it uses a combination of fuzzy logic and deep learning techniques, which allow it to deal with the inherent uncertainty and complexity of real-world traffic scenarios. Additionally, the system may enhance its predictive capabilities through the incorporation of user feedback and expert insights, guaranteeing its relevance and efficacy in ever-changing traffic scenarios. As a solution to the widespread issue of traffic congestion in India, the proposed knowledge-based road traffic congestion prediction system thus brings hope: it offers a comprehensive framework for understanding, analysing, and forecasting congestion levels by combining user perspectives, expert knowledge, and sophisticated technology. Improving traffic management strategies, reducing congestion, and making urban transportation systems in India more efficient might all be possible with the use of this system.
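The Relative Importance Index used in the survey module above is conventionally computed as RII = ΣW / (A × N), where W are the Likert ratings given by respondents, A is the top of the rating scale, and N the number of respondents. A sketch on hypothetical ratings (the 206 real survey responses are not reproduced here, and the attribute names are illustrative):

```python
def rii(ratings, scale_max=5):
    """Relative Importance Index for one congestion attribute (0 to 1)."""
    return sum(ratings) / (scale_max * len(ratings))

# Hypothetical 1-5 importance ratings from six respondents.
poor_geometry = [5, 4, 4, 5, 3, 4]
special_events = [2, 3, 2, 1, 3, 2]
print(round(rii(poor_geometry), 3))
print(round(rii(special_events), 3))
```

Ranking attributes by RII is what lets the study order congestion factors by perceived importance before feeding them into the Fuzzy TISM modeling step.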


Analyzing Music Streaming Patterns Through Business Intelligence: A Power BI-Based Study of Spotify Data

Authors: Dr. Suhas Mache, Bhagyashree Ingale

Abstract: Globally, music streaming services have completely changed how people listen to music. This study uses Power BI-based data analysis to investigate Spotify user behavior and listening habits. Using dynamic dashboards and visual aids, the study looks at regional preferences, artist performance, track patterns, and genre popularity. With the goal of assisting artists, marketers, and producers in making defensible decisions based on streaming data, business intelligence approaches provide actionable insights.

DOI: https://doi.org/10.5281/zenodo.15836070

 


CFD Simulation of an Industrial Dust Cyclone Separator: Influence of Inlet Velocity and Particle Size on Performance

Authors: Dyaga Suresh

Abstract: Cyclone separators are widely used in industrial applications to remove particulate matter from gaseous streams. The performance of a cyclone is highly dependent on operational parameters such as inlet velocity and particle size. In this study, Computational Fluid Dynamics (CFD) is utilized to simulate and evaluate the performance of an industrial cyclone separator. Various inlet velocities (10 m/s to 25 m/s) and particle diameters (1 µm to 50 µm) were analyzed to determine their effects on separation efficiency and pressure drop. The simulations reveal that both inlet velocity and particle size significantly influence cyclone performance. This work helps optimize cyclone design parameters to improve dust removal efficiency while minimizing energy losses.
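The trend the simulations report can be illustrated with the classic Lapple model rather than CFD: the cut diameter d50 shrinks as inlet velocity rises, so collection efficiency for a given particle size improves. The cyclone geometry, number of effective turns, and gas properties below are assumed values, not the paper's:

```python
import math

def lapple_cut_diameter(mu, width, n_turns, v_in, rho_p, rho_g):
    """Lapple cut size d50 (m): particle diameter captured with 50% efficiency."""
    return math.sqrt(
        9 * mu * width / (2 * math.pi * n_turns * v_in * (rho_p - rho_g)))

def lapple_efficiency(d_p, d50):
    """Lapple fractional collection efficiency for particle diameter d_p."""
    return 1.0 / (1.0 + (d50 / d_p) ** 2)

mu, width, n_turns = 1.8e-5, 0.10, 6   # air viscosity (Pa s), inlet width (m), turns
rho_p, rho_g = 2500.0, 1.2             # particle and gas density (kg/m^3)
for v_in in (10.0, 25.0):              # the study's inlet velocity range, m/s
    d50 = lapple_cut_diameter(mu, width, n_turns, v_in, rho_p, rho_g)
    eff = lapple_efficiency(10e-6, d50)  # efficiency for a 10 um particle
    print(f"v={v_in} m/s  d50={d50 * 1e6:.2f} um  eta(10um)={eff:.2f}")
```

The same qualitative behavior (finer cut size and higher efficiency at higher inlet velocity, at the cost of a larger pressure drop) is what the CFD study quantifies across the 1 µm to 50 µm range.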



Analysis and Investigation towards Performance Enhancement of Traffic Light Control System

Authors: K. Ganesh, Assistant Professor K. Abhiram

Abstract: With today's increasingly sophisticated computing techniques, we can only speculate about the level of intelligence the next generation of systems will achieve by refining current methods. This article explores and discusses one beneficiary of these techniques: the smart road traffic system. Using such state-of-the-art computational approaches will undoubtedly make the road traffic light control system simpler and more efficient in the future. In order to optimize overall trip time, the road traffic system takes into account several parameters, such as total travel time, wait time or delay time in the traffic queue, and vehicle speed. Road traffic flow suffers from several problems, such as drivers not obeying the rules of the road, accidents, and congestion. The real issue behind traffic congestion is the lack of a better model for managing traffic on roads, which calls for dedicated research toward a solution. The current traffic light control system has several limitations, such as the inability to adjust the delay time based on traffic density and the inability to prioritize lane flow when emergency vehicles like fire trucks and ambulances are on the road. This study aims to address these traffic issues and, using the aforementioned factors, to optimize overall travel time with minimal delay or congestion. To that end, it employs a probabilistic traffic queue analytical model to obtain a computationally efficient traffic model. An improved traffic light management system is developed by simulating a model that takes vehicle categorization into account and testing it under various traffic conditions; the model accounts for a variable time delay.
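One simple instance of the "probabilistic traffic queue analytical model" the study invokes is the M/M/1 queue, where vehicles arrive at rate lam and the signal serves them at rate mu. The rates below are illustrative, not measurements from the study:

```python
def mm1_delay(lam, mu):
    """Mean time a vehicle spends in the system (waiting + service), W = 1/(mu - lam)."""
    if lam >= mu:
        raise ValueError("queue is unstable when arrival rate >= service rate")
    return 1.0 / (mu - lam)

def mm1_queue_length(lam, mu):
    """Mean number of vehicles in the system, L = rho / (1 - rho)."""
    rho = lam / mu
    return rho / (1.0 - rho)

# Effect of retiming a congested approach: doubling the effective
# service rate sharply cuts the mean delay (hypothetical rates, veh/s).
print(mm1_delay(lam=0.5, mu=0.6))   # heavily loaded signal
print(mm1_delay(lam=0.5, mu=1.2))   # after green-time retiming
```

The steep nonlinearity near saturation (delay blowing up as lam approaches mu) is exactly why density-adaptive green times, as proposed in the abstract, outperform fixed-time control.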
