Category Archives: Uncategorized

Assessing The Impact Of COVID-19 On Renewable Energy Project Finance In The US: Challenges And Opportunities


Authors: Funmilayo Fenwa

Abstract: The COVID-19 pandemic represents an unprecedented global crisis that has fundamentally altered economic landscapes across all sectors, with particular significance for renewable energy project finance in the United States. This study examines the multifaceted impacts of the pandemic on renewable energy investment patterns, policy responses, and market dynamics during 2020. Through comprehensive analysis of industry data, policy documents, and market indicators, this research reveals a complex narrative of resilience and vulnerability within the renewable energy finance sector. While overall renewable capacity additions nearly doubled in the first half of 2020, driven primarily by tax credit deadline pressures, total renewable energy investment declined by 20% to $49.3 billion. The pandemic exposed critical dependencies on supply chains, policy incentives, and financing mechanisms while simultaneously demonstrating the sector's inherent stability advantages. This analysis contributes to understanding crisis resilience in clean energy markets and provides insights for policy development in future emergency scenarios.

DOI: http://doi.org/10.5281/zenodo.16994016


Data Warehouse Modernization for Insurance: Integrating AI and Cloud Technologies


Authors:- Srinivasa Chakravarthy Seethala

Abstract- The insurance sector faces mounting challenges from regulatory changes, competitive pressures, and the demand for real-time data insights. Traditional data warehouses, essential for data storage and retrieval, often lack the flexibility, speed, and scalability required by modern insurance operations. This article examines how integrating Artificial Intelligence (AI) and cloud technologies can drive data warehouse modernization for insurers, delivering real-time decision-making capabilities, optimized data management, and enhanced operational efficiency. We explore methodologies, technologies, and case studies that demonstrate the transformative impact of AI and cloud in modernizing legacy data warehouses in the insurance sector.

DOI: 10.61137/ijsret.vol.10.issue5.794


IJSRET Volume 7 Issue 6, Nov-Dec-2021


Performance Analysis of Hybrid Solar Dryer (Review Paper)
Authors:- M. Tech. Scholar Manoj Kumar Mishra, Prof. C S Koli, Prof. Amit Agrawal

Abstract- Solar crop drying is an inexpensive and effective way to preserve food, especially in developing countries where fuel and electricity are expensive or unavailable. Some tropical fruits are difficult to transport and store, which increases their chances of spoilage. Without access to fuel and large drying systems, preserving fruit for later use is challenging or impossible for the rural farmer. Developing a low-cost, locally assembled, low-maintenance fruit drying system will improve access to off-season and distant markets. A mathematical model of a hybrid solar drying system was developed and validated through experimental testing to design and optimize drying systems for use in developing countries. The prototype drying system consisted of a transpired solar absorber, a drying chamber and a blower. In this research paper, we review various solar dryers.

DVR to Mitigate Power Quality and Reduce the Harmonics Distortion of Sensitive Load
Authors:- Suhail Rafiq, Priya Sharma

Abstract- Power quality is an issue of growing importance in modern industrial and commercial applications. Voltage disturbances, especially voltage sag and swell, are the most common power quality problems due to the increased use of a large number of sophisticated and sensitive electronic equipment in industrial systems. To overcome this problem, custom power devices are used. One such device is the Dynamic Voltage Restorer (DVR), the most efficient and effective modern custom power device used in power distribution networks. It is a series-connected, power-electronics-based device that can quickly mitigate voltage sags in the system and restore the load voltage to its pre-fault value. The primary advantage of the DVR is keeping users online with high-quality, constant voltage, maintaining continuity of production. In this paper, a PI controller and a fuzzy logic controller method for a DVR that protects a sensitive load against voltage sag under unbalanced loading conditions (linear and non-linear) is presented. The DVR, along with other parts of the distribution system, is simulated using MATLAB/Simulink.
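The sag-restoration behaviour described above can be sketched with a toy discrete PI loop. This is an illustrative, hypothetical sketch, not the paper's controller: the gains, the per-unit scalar model and the sag depth are all assumptions, and a real DVR acts on three-phase instantaneous voltages through a series injection transformer.

```python
# Hypothetical sketch: a discrete PI controller computing the series
# compensation voltage a DVR would inject to restore a sagged load
# voltage to its 1.0 p.u. pre-fault reference. Gains are illustrative.
def dvr_pi_compensation(sag_pu, kp=0.5, ki=0.2, steps=50, ref_pu=1.0):
    """Return the load-voltage trajectory while the DVR compensates a sag."""
    integral = 0.0
    injected = 0.0
    trajectory = []
    for _ in range(steps):
        load_v = sag_pu + injected       # grid voltage plus series injection
        error = ref_pu - load_v          # deviation from the pre-fault value
        integral += error
        injected = kp * error + ki * integral
        trajectory.append(load_v)
    return trajectory

# A 30% sag (0.7 p.u.): the controller drives the load voltage back toward 1.0 p.u.
traj = dvr_pi_compensation(0.7)
```

Replacing the fixed gains with a rule base that maps the error and its rate of change to the injected voltage is, in essence, the fuzzy-logic variant the paper compares against.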

Applications of Green Chemistry in Daily Life: A Review
Authors:- Dr. Pushpraj Singh

Abstract- Green chemistry, also called sustainable chemistry, is an area of chemistry and chemical engineering focused on the design of products and processes that minimize or eliminate the use and generation of hazardous substances. It is a modern science that deals with the application of environmentally friendly chemical compounds and materials in various areas of our life, such as industrial uses and many others. Green chemistry began as a response to the need to reduce the damage done to the environment by man-made materials and the processes used to produce them. Green chemistry can include anything from reducing waste to disposing of waste in the correct manner. Chemistry plays a pivotal role in determining the quality of life. The chemicals industry and other related industries supply us with a huge variety of essential products, from plastics to pharmaceuticals. However, these industries have the potential to seriously damage our environment. Green chemistry therefore serves to promote the design and efficient use of environmentally benign chemicals and chemical processes. All these points are discussed in this article.

Improving Productivity in a Duct Fabricating Industry Using Industrial Engineering Tools and Techniques
Authors:- M. Tech. Scholar Shradha Agnihotri, Prof. Hari Mohan Soni

Abstract- Productivity improvement is one of the core strategies for achieving manufacturing excellence, and it is also necessary for sound financial and operational performance. It enhances customer satisfaction and reduces the time and cost to develop, produce and deliver products and services. Productivity has a positive and significant relationship to performance measures such as process utilization, system output, product cost, work-in-process inventory levels and on-time delivery. Improvement can take the form of eliminating or correcting ineffective processing, simplifying the process, optimizing the system, reducing variation, maximizing throughput, reducing cost, improving quality or responsiveness, and lowering set-up time.

Biomedical Applications of Polysaccharides
Authors:- Dr. Pushpraj Singh

Abstract- Polysaccharides are the most complex carbohydrates, composed of monosaccharide units bound together by glycosidic linkages. They are obtained from renewable sources such as plants, algae and microorganisms like fungi and bacteria. They are receiving more attention because they exhibit a wide range of biological activities, such as anti-tumor, immunomodulatory, antimicrobial, antioxidant, anticoagulant, antidiabetic, antiviral, and hypoglycemic activities, making them among the most promising candidates in the biomedical and pharmaceutical fields. In this review, we give insight into the most recent applications of polysaccharides.

Adaptive Video Streaming over HTTP Through Wireless High Speed Network
Authors:- M. Tech. Scholar Monu Kumar Saini, Assistant Professor Mr. Rahul Pandey

Abstract- Adaptive bitrate streaming works on the principle of rapidly changing the bitrate based on available resources, usually bandwidth, resource limits and user requests. The selection of bitrates is usually controlled by the client, not by the server. Adaptive HTTP video streaming is similar to progressive download streaming: the multimedia content is split into segments of equal length, and the segments are aligned with video GoPs that begin with a key frame, so that each segment carries no dependencies on past or future segments. Finally, the video segments are encoded and hosted on an HTTP streaming server. Online video streaming traffic has been growing massively in recent years; it was estimated that online video streaming services would account for more than 80% of internet traffic by 2020. Hence, the design of next-generation video streaming services is driven by internet traffic.
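The client-driven rate selection described above can be sketched as a simple throughput rule. This is a hedged illustration, not the paper's algorithm: the encoding-ladder values, the safety margin and the throughput-only policy are assumptions, and deployed players also weigh buffer occupancy and switching smoothness.

```python
# Illustrative adaptive-bitrate rule: pick the highest rung of the encoding
# ladder that fits within a safety fraction of the measured throughput.
# Ladder values (kbit/s) are hypothetical, not from the paper.
LADDER_KBPS = [300, 750, 1500, 3000, 6000]

def select_bitrate(throughput_kbps, safety=0.8):
    """Highest ladder bitrate not exceeding safety * measured throughput."""
    budget = throughput_kbps * safety
    chosen = LADDER_KBPS[0]            # always fall back to the lowest rung
    for rate in LADDER_KBPS:
        if rate <= budget:
            chosen = rate
    return chosen
```

With 2000 kbit/s measured throughput and the default 0.8 margin, the budget is 1600 kbit/s, so the client would fetch the 1500 kbit/s rendition for the next segment.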

Survey Paper on Security against Dynamic Routing in Wireless Sensor Networks and Internet
Authors:- M. Tech. Scholar Md Abid Hussain, Assistant Professor Mr. Manish Sahu

Abstract- Wireless sensor networks (WSNs) have been identified as one of the most innovative applications of technology in the 21st century. Small-sized, low-cost and low-power sensor nodes are designed to communicate within a short range and work together to form a sensor network that gathers data from a field. Cluster-based routing protocols are well known for enhancing lifetime in WSNs. The Low-Energy Adaptive Clustering Hierarchy (LEACH) protocol is one of the foremost and most prominent hierarchical protocols; it enhances network lifetime by reducing and distributing the energy consumption among the nodes in a network. This paper studies a wireless sensor network based on a fuzzy system, with the aim of improving network lifetime with the help of the fuzzy system.
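For concreteness, the rotating cluster-head election that gives LEACH its name uses a per-round threshold T(n). The sketch below follows the commonly published formula; the values of p (desired fraction of cluster heads) and r (round number) are illustrative inputs, not values from the paper.

```python
import random

# LEACH cluster-head election: each node that has not yet served as a
# cluster head in the current epoch elects itself with probability T(n),
# where T(n) = p / (1 - p * (r mod 1/p)). The threshold rises over the
# epoch so every eligible node is certain to serve exactly once.
def leach_threshold(p, r):
    """Election threshold T(n) for an eligible node in round r."""
    return p / (1 - p * (r % (1 / p)))

def elects_cluster_head(p, r, rng=random.random):
    """True if the node becomes cluster head this round."""
    return rng() < leach_threshold(p, r)
```

With p = 0.1, the threshold starts at 0.1 in round 0 and climbs to 1.0 by round 9, forcing the last eligible nodes to take their turn, which is how LEACH spreads the energy cost of cluster-head duty across the network.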

Survey Paper on Performance Evaluation of 5G System using Filter Bank Multicarrier Technique
Authors:- M. Tech. Scholar Roshni Patle, Associate Professor Mrs. Deepti Agrawal

Abstract- This paper reviews multiple-input multiple-output (MIMO) transmission using space-time block codes on the IEEE 802.16 system. Worldwide Interoperability for Microwave Access (WiMAX), which can offer high-speed voice, image, video and data services based on the IEEE 802.16 wireless MAN standard, is configured in the same way as a traditional cellular network. The range of WiMAX makes the system very attractive for users, but there will be a slightly higher BER at low SNR. This paper studies different types of 4G and 5G techniques and explains the advantages and disadvantages of each system.

Fatigue Analysis of Welded Joint Using Ansys: A Review Study
Authors:- M. Tech. Scholar Mohammad Imran, Prof. Dr. Ram Gopal Verma

Abstract- Fatigue failures in welded structures result in fatalities and significant financial losses. The adoption of various standards and fatigue design rules has been proposed as a solution to this problem. The foundations of such codes are, in some instances, based on outdated ideas that are difficult to reconcile with the output of current computer programmes and are also restricted to relatively simple structures. The purpose of this article is to investigate the fatigue strength of welded joints using a fracture mechanics method that takes welded-joint fatigue behaviour into consideration. The technique helps determine the fatigue crack propagation rate as a function of the difference between the applied driving force and the material's crack propagation threshold, which is itself a function of crack length. Failure of welded structures or machine components results in a variety of direct losses, including the cost of repair, the cost of effort to prevent future failure, accident compensation, and reduced output. Because joints are the weakest component of any structure or machine, they are likely to fail first. As a result, it is critical to investigate why certain joints fail. Understanding the causes of failure and how cracks propagate helps in assessing welded joints from a reliability standpoint. Some failure causes may be significant, and they may be reduced at the design or manufacturing stage, resulting in overall failure minimization.

A Review on Quality Improvement in Shaft Manufacturing Industry Using Seven Quality Tools
Authors:- M. Tech. Scholar Anand Kumar, Prof. Hari Mohan Soni

Abstract- In today's highly competitive scenario, markets are becoming global and economic conditions are changing fast. Customers are more quality-conscious and demand high-quality products at competitive prices, with product variety and reduced lead time. Quality improvement is a data-driven strategy used to improve processes; it is an integral part of a Six Sigma initiative, but in general can be implemented as a standalone quality-improvement procedure or as part of other process-improvement initiatives such as lean. Any enterprise that cannot manage the quality of its methods and products tends to fall apart. Quality is crucial to sales, cost control, productivity, risk control and compliance. As essential as quality is, there is little agreement as to its definition. Therefore, in this study, seven quality tools have been reviewed.

A Review on Thermal Performance Analysis of Single Effect Vapour Absorption System
Authors:- M. Tech. Scholar Sanjay Sharma, Prof. Deenoo Pawar

Abstract- Cooling and refrigeration demand accounts for a significant portion of global energy consumption. Alternative cooling systems, including absorption and adsorption cooling systems, have received more attention than before because mechanical vapour compression systems demand high-grade energy to operate. Conventional cooling methods outperform absorption and adsorption cooling systems in terms of overall performance. Today's world is confronted with two major environmental issues: the energy problem and the greenhouse effect. Scientists have been attempting to find solutions to these issues, and that fact underpins the majority of today's inventions. The lithium-bromide and water absorption refrigeration cycle is an excellent illustration of this idea, since it reduces fossil fuel use and hence CO2 emissions, while also making use of the low-grade heat of various industries and data centers. Accordingly, in this paper, a review on the thermal performance analysis of single-effect vapour absorption systems has been done.

Study and Analysis of Effect of Online Education Due to Covid-19 on Student Performance and Evaluation
Authors:- PG Scholar S. Shravya Geethika, PG Scholar Dy. Kiran Kumar

Abstract- The goal of this study was to conduct an online survey to gather information about the experiences of teachers and students in online classes. In light of the present pandemic crisis, the Indian school system has recently implemented an internet delivery method for classes. As a result of COVID-19, online lessons have been made mandatory for college and university teachers and students. This survey, therefore, provides an overview of their impressions and concerns. More than 400 students from local colleges and universities were included in the study's sample. The data was gathered using an online survey. For both teachers and students, the following areas are critical: high-quality and timely interaction between professor and student, availability of technological support, structured online class modules, and adjustments to fit practical classes.

Smart Home Energy Management System with Hybrid Energy Supply System
Authors:- M. Tech. Scholar Pratik Kumar Sharma

Abstract- The constant rise in household energy demand necessitates an effective energy management strategy. The development of smart houses that enable interaction between users and their home appliances, together with the adoption of hybrid energy supplies, has produced an effective home energy management system (HEMS) concept that operates automatically, multi-functionally, adaptively, and efficiently. Utility companies encourage household consumers to use in-home energy management devices. The utility's principal purpose is to lower peak load demand on the system, while the consumer wants to save money on their electricity bills. Renewable energy sources (RESs), a backup battery storage system (BBSS), and optimal power-sharing mechanisms can all improve the benefits of HEMS. Typically, the residential customer deals with a plethora of smart home gadgets, each of which has a different operating-time priority based on the consumer's preferences. In this research, a cost-effective power-sharing technique based on power availability is created using an effective home energy management system concept and a hybrid energy storage system. The BBSS is charged and discharged using energy from the grid, real-time energy pricing, and renewable energy sources.
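A power-sharing rule of the kind the abstract describes can be sketched in a few lines. This is a hypothetical illustration, not the paper's technique: the merit order (renewables first, then battery, then grid), the battery capacity and the single-step accounting are all assumptions, and it ignores real-time pricing and export.

```python
# Hypothetical HEMS dispatch for one time step: serve the household load
# first from renewables, then from the battery, and only then from the
# grid; any renewable surplus charges the battery. Numbers illustrative.
def dispatch(load_kw, res_kw, soc_kwh, capacity_kwh=10.0, step_h=1.0):
    """Return (grid_kw, new_soc_kwh) after one dispatch step."""
    net = load_kw - res_kw                 # positive: deficit, negative: surplus
    if net <= 0:
        # Renewable surplus charges the battery up to its capacity;
        # any remainder is curtailed or exported (not modelled here).
        charge = min(-net * step_h, capacity_kwh - soc_kwh)
        return 0.0, soc_kwh + charge
    discharge = min(net * step_h, soc_kwh)  # battery covers what it can
    grid_kw = net - discharge / step_h      # grid supplies the rest
    return grid_kw, soc_kwh - discharge
```

For example, a 3 kW load with 5 kW of solar stores the 2 kW surplus in the battery and draws nothing from the grid, which is exactly the peak-shaving behaviour the utility is after.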

Blockchain in Healthcare
Authors:- Yash Dave, Ansh Dalwadi

Abstract- Blockchain technology is commonly referred to as the fourth industrial revolution, which will alter the globe. Blockchain technology provides a decentralized, distributed environment free of central authority. Since Bitcoin launched Blockchain, research has continued on non-financial use cases to extend its applicability. Healthcare is an industry on which Blockchain can have a significant influence, and it has embraced the transformative nature of Blockchain technology. Blockchain is frequently viewed as the most necessary and optimal healthcare technology to handle sophisticated and complex security and interoperability concerns. More significantly, the smart contract mechanism of this "value"- and trust-based system can offer automatic action and reaction. Healthcare, on the other hand, is a complex system. In this paper, we introduce the blockchain and its properties, as well as the significance of the blockchain in healthcare. The paper also covers blockchain administration, adjudication of claims, interoperability, and applications. While blockchain technology appears in several situations, this paper focuses on the use of blockchain in health care and the reasons why blockchain should be utilized. We introduce the advantages of blockchain as well. Furthermore, we examine the difficulties and future prospects, and how blockchain may be implemented in more healthcare industries. The paper also discusses the current level of Blockchain application development for healthcare, its limits, and topics for further research. This paper aims to demonstrate how Blockchain technologies may be utilized in healthcare, what problems this technology may face in the future, and what the Blockchain's prospects are.
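The tamper-evidence property the abstract relies on can be shown with a minimal hash chain. This is a toy sketch, not a healthcare blockchain: the record strings and field names are hypothetical, and a real deployment adds consensus, signatures and access control on top of this linking.

```python
import hashlib
import json

# Minimal hash-chained ledger: each block commits to its record and to the
# previous block's hash, so altering any record invalidates all later links.
def make_block(record, prev_hash):
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return {"record": record, "prev": prev_hash,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

def build_chain(records):
    chain, prev = [], "0" * 64          # genesis predecessor
    for rec in records:
        block = make_block(rec, prev)
        chain.append(block)
        prev = block["hash"]
    return chain

def verify(chain):
    """Recompute every hash; any tampering breaks the chain."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        if make_block(block["record"], prev)["hash"] != block["hash"]:
            return False
        prev = block["hash"]
    return True
```

Building a chain of two illustrative claims and then editing the first record makes `verify` fail, which is the integrity guarantee that motivates blockchain for claims adjudication and record sharing.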

A Study on Consumer Behaviour towards Big Bazaar in Kozhikode
Authors:- Dr. S. Elango, Ph.D., Dr. P. Gowthaman, Ph.D.

Abstract- The study on consumer behaviour towards Big Bazaar in Kozhikode was carried out with the objective of knowing the satisfaction level of customers at Big Bazaar and whether customers are aware of the different types of products, services and offers provided at Big Bazaar. The total sample size was seventy-five (75) customers of Big Bazaar in Calicut. The research shows that customer satisfaction at Big Bazaar is very good. Many customers are not aware of the products and services provided by Big Bazaar that are not offered by other retail stores. On the other hand, there are also existing customers of Big Bazaar who are satisfied with the working style of the retail store and the customer support systems at Big Bazaar. They want Big Bazaar to carry out promotional activity, such as advertising on social media, so that it can attract more customers. The researcher used the questionnaire method to carry out the study.

A Review Article of Power Load Flow Analysis for Active Islanding Mode
Authors:- Keshav Kumar, Dr. Anil Kumar Kori

Abstract- Power flow studies are very important in the planning and expansion of power systems. With the integration of distributed generation (DG), micro-grids are becoming attractive, so it is important to study the power flow of micro-grids. In grid-connected mode, the power flow of the system can be solved in a conventional manner. In islanded mode, conventional methods (such as Gauss-Seidel) cannot be applied directly to power flow analysis. Hence, some modifications are required to apply the conventional Gauss-Seidel method to islanded micro-grids.
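The "conventional manner" the abstract refers to is an iterative sweep of the Gauss-Seidel kind. As a hedged illustration of that sweep-and-update pattern, the sketch below solves a small diagonally dominant linear system; actual power-flow codes apply the same pattern to complex bus voltages with P-Q and P-V bus equations, which this toy example does not model.

```python
# Gauss-Seidel sweep: each unknown is updated in place using the freshest
# values of the others, repeated until convergence. Shown here on a small
# diagonally dominant linear system Ax = b (illustrative values only).
def gauss_seidel(A, b, iters=50):
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            # Sum of off-diagonal terms using the most recent x values
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
    return x

A = [[4.0, 1.0], [2.0, 5.0]]
b = [9.0, 16.0]
x = gauss_seidel(A, b)   # converges toward the exact solution (29/18, 23/9)
```

In islanded mode the slack-bus assumption that anchors this iteration disappears, which is precisely why the abstract says the method needs modification for islanded micro-grids.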

Dermoscopic Image Classification using ResNet-50 in MATLAB
Authors:- V. Rajmohan

Abstract- Nowadays, cancer is one of the most complex diseases to diagnose. Cancer means cells in the body grow out of control. Exposure of the skin to the sun can cause abnormal growth of skin cells in the human body. Skin cancer has several types, such as basal cell carcinoma, squamous cell carcinoma and melanoma; among these, melanoma is the most difficult for dermatologists to predict. If melanoma is detected at an early stage, it is easier to cure. Computer vision and image processing play a pivotal role in the field of medical imaging and diagnosis, as several methods have already proved. In our work, we present a computer-aided method for skin cancer (i.e., melanoma) detection using the MATLAB Image Processing Toolbox. The input dermoscopic skin image is applied to the system using the proposed schemes. The image analysis tool segments the skin cancer region and extracts its features. The features are then applied to a classifier, which predicts whether the segmented region belongs to the melanoma or non-melanoma type.

Technical and Economic Feasibility of Using Steel Fiber Reinforced Concrete (SFRC) in Slab Lines of Subways and Railroads
Authors:- Seyedmohammad Fatemi

Abstract- The utilization of steel fiber in railroads is under consideration in some countries around the world. The purposes of using these materials are to increase load capacity and to control cracking in the slab of subway and railroad lines. Fibers are commercially available and manufactured from steel, plastic, glass, and other natural materials. In many situations it is prudent to combine fiber reinforcement with conventional steel reinforcement to improve performance. The concrete model can be used for analysis of the failure mechanism of reinforced concrete structural elements. Although most current railway tracks are still of the traditional ballasted type, recent applications tend more and more towards non-ballasted track. Slab track designs have significant advantages compared to ballasted tracks, the most significant being the high stability of the track; their main disadvantage is higher construction costs. In order to select the most suitable concrete for the construction, the method of analytic hierarchy process based on expert knowledge has been used. This study conducted a series of laboratory works to compare the effect of steel fibers, used in various resistance categories, on concrete behaviour parameters. The samples were mixed for three resistance categories: 25, 35 and 45 MPa. The strength parameters chosen to characterize concrete behaviour are tensile strength, impact strength, compressive strength, and flexural strength. Also, the samples in each resistance category were made with four fiber quantities: without fibers, and with 15, 25 and 35 kg of fibers per cubic meter. The results suggest that using steel fibers significantly increases the impact resistance, time to first crack and ultimate strength of concrete. The addition of this type of fiber also increases tensile and flexural strength, but does not have a significant effect on the compressive strength of concrete.

Performance Analysis of Shell and Tube Heat Exchanger: Parametric Study
Authors:- M. Tech. Scholar Manu Mishra, Prof. Maneesh Dubey

Abstract- Refrigeration, air conditioning, and chemical plants all use heat exchangers. They are utilised for a variety of purposes, including transferring heat from a hot to a cold fluid, and are commonly employed in many industrial settings. Researchers have worked on a variety of projects in an attempt to increase performance. In this study, a shell and tube heat exchanger with 10 baffles placed along the shell in alternating orientations (cut facing up, cut facing down, etc.) is considered in order to create flow paths across the tube bundle. The geometric modelling is done using the CAD software CATIA V5R21, because it is easy to model the heat exchanger in 3D modelling software.

Design & Thermal Analysis of Double Pipe Heat Exchanger by Changing the Mass Flow Rate
Authors:- M. Tech. Scholar Rahul Vishwakarma, Prof. Maneesh Dubey


A Review on Design & Development of a Solar Pond and CFD Modeling
Authors:- M. Tech. Scholar Ravil Khateek, Prof. Maneesh Dubey

Abstract- The evaporation-condensation process for converting saline water to fresh water is the most widespread and the oldest technology used for desalination. In this process, the saline water is heated by a heat source (here, the solar pond) to the evaporating point, and the steam that evaporates leaves behind the salt content present in the saline water. The evaporated fresh steam is then collected and condensed to give us fresh water. This fresh water is then distilled again to reduce the remaining ppm, thus making it drinkable. This technology is cheaper than the others, since it does not require any energy-consuming elements other than the heat source and condenser. Thus, by using a solar pond as the heat source, we reduce the cost of the desalination process. Accordingly, a review on the design and development of a solar pond and its CFD modeling has been done.

Performance Evaluation of Geosynthetics Geotextile Based Pavement Using CBR Test
Authors:- Lecturer Ritu Mewade

Abstract- Two design methods were used to quantify the improvements from using geotextiles in pavements. In this study, a comprehensive life-cycle cost analysis framework was developed and used to quantify the initial and future costs of 25 representative low-volume road design alternatives. A 50-year analysis cycle was used to compute the cost-effectiveness ratio when geotextile is used in the design methods. The effects of three flexible pavement design parameters were evaluated, and their impact on the CBR results was investigated.

Exploratory Research Using Bacteria as a Self-Healing Concrete: Review
Authors:- Research Scholar Teena Chandnani, Associate Professor Dr. N.K. Dhapekar

Abstract- This study illustrates that the use of the microorganism Bacillus subtilis is effective for developing a durable framework, and puts forward self-healing concrete as a strategy for crack control to enhance service life in concrete structures. Crack formation is a very common occurrence in concrete structures; cracks allow water and different types of chemicals into the concrete, reducing its durability and strength, and also affect the reinforcement when it comes in contact with water, carbon dioxide and other chemicals. It is expensive to maintain or repair concrete-based structures frequently. To resolve this issue, a self-healing mechanism is introduced into the concrete, which helps to repair cracks by producing calcium carbonate crystals that close up the micro-cracks and pores in the concrete. The investigation illustrates that there is a remarkable increase in the quality of cement with added bacteria (bacterial concrete) compared with conventional concrete.

Voice Assistant
Authors:- Ritik Porwal, Ujjawal Tomar, Vishakha Dubey, Asst. Prof. Akshita Mishra, Asst. Prof. Gourav Mandloi

Abstract- The project is based on a Voice Personal Assistant (VPA), a digital assistant that uses voice recognition, natural language processing and speech synthesis to aid users through voice-driven applications. One of the most studied and popular directions of interaction is based on the machine's understanding of natural human language. It is no longer a human who learns to communicate with a machine, but a machine that learns to communicate with a human, exploring the user's actions, habits and behavior and trying to become a personalized assistant.

Digital Badging Platform
Authors:- Shubham Singla, Asst. Prof. Ajay Kaushik

Abstract- In recent times, due to Covid, most things, especially education, have moved online, and students and teachers alike have found it difficult to cope with this situation. It was a sudden change that no one expected or was prepared for, and we are now facing some real issues. Considering the students' case, they lose the motivation to study, there is no competition among peers, classes seem boring to them, and they start skipping classes. In the teachers' case, they are not able to pay much attention to every student, they do not know where students are lacking, and they cannot deliver a proper system to them. After analyzing this problem, which is quite big, it becomes important to solve it. Digital badges are becoming an appropriate, easy and efficient way for educators, community groups and other professional organizations to exhibit and reward participants for skills obtained in professional development or formal and informal learning. I have proposed a web-portal-based solution for the above problems, which aims to solve the problem of motivation among students and also helps teachers to know their students well.

A Survey on Wireless Network Optimization by Attack Prevention and Detection Techniques
Authors:- Lipi Sharma, Dr. Vivek Richariya

Abstract- Wireless sensor networks are an important platform for the implementation and maintenance of many applications and services. An open network for communication increases flexibility, but also vulnerability to attacks. It is a critical challenge to develop effective and lightweight security mechanisms to detect and prevent various attacks on WSNs. Attacks are listed in this paper and classified according to the nature of the activity performed by malicious nodes. The paper summarizes the energy dependency of WSNs and lists some techniques to extend the life of a WSN. Different techniques of network attack detection proposed by many researchers are detailed, and some energy-optimization papers that increase the lifespan of the network are also introduced.

Bus Notification System with Alarm
Authors:- T. Jahnavi Lakshmi, K. Sri Lakshmi, G. Naga Sai Teja, J. Harika, Associate Prof. G. Kalyani

Abstract- Nowadays, an individual cannot determine which bus is coming to the bus depot; they are always reliant on somebody to assist them with this issue. So we have come up with a project which allows a person to find out which buses are coming to the bus depot and also gives them information about the route they take. In this system, a GPS device needs to be placed on all the buses. This helps a person find out which bus is coming to the bus depot and what its route and stops are, without depending on anybody.

Processing of M2 High Speed Steel Powder Using Waste Rubber Based Binder by Metal Injection Moulding Technique
Authors:- Mohd Afian Omar, Norhaslina Johari

Abstract- This paper examines the compatibility of waste rubber as one of the binder components in powder injection moulding feedstock. M2 high speed steel powder with a mean particle diameter of 16 µm was used for this study. The metal powder, at a powder loading of 65 vol.%, was mixed with paraffin wax, polyethylene, waste rubber, and stearic acid in a z-blade mixer for two hours at 160°C to produce the feedstock. The feedstock was injection moulded on a vertical injection moulding machine with a nozzle temperature of 180°C and a pressure of 350 bar. The moulded parts were immersed in n-heptane at 60°C for five hours to remove the paraffin wax and stearic acid, and the specimens were then sintered at temperatures of 1200°C-1260°C in a controlled vacuum atmosphere. The physical, mechanical, and microstructural properties of the sintered specimens were studied and discussed.

Artificial Intelligence in Heart Disease Treatment
Authors:- Research Scholar Anirban Chakraborty

Abstract- Cardiovascular disease (CVD), despite the significant advances in the diagnosis and treatments, still represents the leading cause of morbidity and mortality worldwide. In order to improve and optimize CVD outcomes, artificial intelligence techniques have the potential to radically change the way we practice cardiology, especially in imaging, offering us novel tools to interpret data and make clinical decisions. AI techniques such as machine learning and deep learning can also improve medical knowledge due to the increase of the volume and complexity of the data, unlocking clinically relevant information. Likewise, the use of emerging communication and information technologies is becoming pivotal to create a pervasive healthcare service through which elderly and chronic disease patients can receive medical care at their home, reducing hospitalizations and improving quality of life. The aim of this review is to describe the contemporary state of artificial intelligence and digital health applied to cardiovascular medicine as well as to provide physicians with their potential not only in cardiac imaging but most of all in clinical practice.

Strategic Framework for Managing Transformational Change towards Sustainability in Ethiopian Banking Industry
Authors:- Dr. Abreham Tesfaye Abebe

Abstract- This study aims to develop a strategic framework for managing transformational change towards sustainability in the Ethiopian banking industry. It was guided by five critical research questions aligned with its core objectives. To make the sample representative, the researcher included three private commercial banks in Ethiopia that entered the industry in different periods; respondents were drawn from these banks, most importantly senior executive leadership, middle-level management, and senior experts in the area. After the framework was developed around the environmental, social, and economic dimensions of sustainability, it was validated with fifteen professionals who each have over 20 years of experience in the Ethiopian banking industry. Questionnaires and interviews were employed, and the study recommends that sustainability be understood from a more holistic perspective, taking all three dimensions (environmental, economic, and social) into consideration. It further recommends continuous training on the concept of sustainability in relation to banking business, performance management in that regard, and that the bank's community clearly know where they can contribute to change initiatives towards sustainability.

Impact of Aircraft Noise Generation on Residential Neighbourhoods near Port Harcourt International Airport Omuagwa, Ikwerre Local Government Area, Rivers State, Nigeria
Authors:- Ogbonna VA, Ebubeoniso JJ, Weli VE

Abstract- This study examined the impact of aircraft noise on residential neighborhoods near Port Harcourt International Airport, Omuagwa, Nigeria. Medical health records for the acoustic report were obtained from the Omuagwa health center, and field noise measurements were taken at sensitive noise receptors in the airport and the host community using a precision sound level meter (model 2310 SL, IEC 651). Seven years of acoustic health records, covering 2013 to 2019, were obtained and analyzed. Field noise data were sampled within and outside the airport terminal at the following points: Terminal 1 (95.6 dBA), Terminal 2 (94.0 dBA), Airport Junction (94.4 dBA), Igwuruta Stadium (88.0 dBA), Omuagwa Village 1 (92.7 dBA), Omuagwa Village (91.7 dBA), and Omuagwa village market (92.3 dBA). The results show two health conditions attributable to aircraft noise, hypertension and hearing impairment, with 2,990 cases in total. Hearing impairment is prevalent among adults above 18 years, with 2,248 cases, alongside 742 cases of hypertension in the study area. The relationship between aircraft noise and residents' health was determined using regression analysis, and the statistical result revealed a significant relationship. To mitigate the effect of noise on people residing around airports, the aviation industry should deploy noise metrics to ascertain the noise levels of different aircraft during daytime and nighttime and the frequencies of takeoff and landing within and outside the airport terminal. The Federal Ministry of Aviation should also monitor and restrict land development encroachment by the host communities near the airport, to help curb the risk of noise pollution.
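
The regression step can be reproduced with ordinary least squares; in the sketch below the noise levels are the measured values from the abstract, while the paired case counts are hypothetical stand-ins for illustration only.

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for paired observations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Measured mean noise levels (dBA) paired with hypothetical case counts:
noise = [88.0, 91.7, 92.3, 92.7, 94.0, 94.4, 95.6]
cases = [210, 350, 380, 400, 460, 470, 540]
slope, intercept = linear_fit(noise, cases)
```

A positive slope here would correspond to the significant noise-health relationship the regression analysis reported.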

A Study on Consumer Attitude towards Online Shopping With Special Reference to Nilgiri District
Authors:- M. Mahesh Kumar, Sobha P. G., Asst. Prof. N. Jemila Dani, Andrew J. R., Vanitha S., Cuneyt, K. Gautam B., Dr. S. Elango Ph.D, Dr. P. Gowthaman Ph.D

Abstract- In the present era of globalization, electronic marketing is driving a great revolution. Over the last few decades, most business organizations have come to operate through the adoption of technology and technological change. Online shopping, or online marketing, is the use of technology (i.e., computers) to achieve better marketing performance and to reach a large number of customers regardless of boundaries and territories. Retailers are devising strategies to meet the demand of online customers: they are busy studying consumer taste, preference, and behaviour in the field of online shopping, seeking to understand consumer attitudes towards it, and making the necessary changes to their strategies and plans. The aim of the present study is likewise to understand consumers' attitudes towards online shopping and, specifically, to study the factors influencing consumers to shop online.

Prediction of Heart Disease Using Machine Learning Techniques
Authors:- M. Tech. Student Chetana Patil, Asst. Prof. Dr. Priti Subramanium, Head & Asst. Prof. Dr. Dinesh D. Patil

Abstract- In today's world, heart disease prediction is a critical challenge. Machine learning (ML) is effective at making predictions and supporting decisions from the huge quantity of data created by the healthcare industry. Because the healthcare industry holds a large amount of medical data, machine learning algorithms are necessary for effective decision-making in the prediction of heart disease. Various hybrid machine learning techniques are being used in recent developments across areas of the Internet of Things (IoT). A prediction model is introduced with different combinations of features and some known classification techniques. Data pre-processing uses various techniques, such as attribute selection and the handling of missing data, noisy data, and default-value filling, for decision-making and prediction at different levels.
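
As one concrete instance of the pre-processing described (default-value filling for missing data), a minimal mean-imputation step is sketched below; the sample rows and feature names are hypothetical, not the paper's dataset.

```python
def impute_means(rows):
    """Replace missing numeric values (None) with the column mean --
    a minimal version of the 'default values filling' pre-processing step."""
    cols = list(zip(*rows))
    means = []
    for col in cols:
        present = [v for v in col if v is not None]
        means.append(sum(present) / len(present))
    return [[v if v is not None else means[j] for j, v in enumerate(row)]
            for row in rows]

# Hypothetical records: age, cholesterol, max heart rate (with gaps)
data = [[63, 233, 150], [41, None, 172], [56, 236, None]]
clean = impute_means(data)
```

After imputation every row is complete, so a downstream classifier can be trained on the cleaned matrix.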

Digital Badging Platform
Authors:- Shubham Singla, Asst. Prof. Ajay Kaushik

Abstract- In recent times, owing to Covid, most things, especially education, have moved online, and both students and teachers have found it difficult to cope with this situation. It was a sudden change: no one expected education to go online, no one was prepared for it, and we are now facing some real issues. In the students' case, they lose the motivation to study, there is no competition among peers, classes seem boring to them, and they start skipping classes. In the teachers' case, they are not able to pay much attention to every student, they still do not know where each student is lacking, and they are unable to deliver a proper system to them. On analysis this problem appears quite large, and it becomes important to solve it. Digital badges are becoming an appropriate, easy, and efficient way for educators, community groups, and other professional organizations to exhibit and reward participants for skills obtained in any professional development or in formal and informal learning. I have proposed a web-portal-based solution to the above problems, which aims to solve the problem of motivation among students and also helps teachers to know their students well.

The Effects on Rate of Change of Thermal Behavior of Auto Muffler with Thermo Electric Module
Authors:- M. Tech. Scholar Rajat Kumar Sahoo, Prof. C S Koli, Prof. Amit Agrawal

Abstract- The thermal load and the external insulation of the exhaust system are the main factors affecting the inlet gas temperature of a catalytic converter. Under normal operating conditions, catalytic converters are highly effective at reducing air pollution from internal combustion engines, and the exhaust gases flowing through the exhaust system need to be cooled before reaching the converter to increase its performance. Heat transfer analysis of the automotive exhaust system is therefore important in the design and optimization phases of the exhaust aftertreatment system. The heat lost between the engine outlet and the catalytic converter determines the energy delivered to the catalyst, and thus the temperature rise of the catalyst, which in turn affects catalyst lifetime. A significant number of studies on exhaust manifolds, exhaust piping, and catalytic converter packaging have sought to improve performance based on heat transfer analysis of the exhaust system. The resulting heat transfer expressions, based on experiments and mathematical modelling, are used in computational models during the design and optimization of exhaust system parts to facilitate the selection of suitable materials and designs for better performance.

Thermal Performance Analysis of Active Type Solar Dryer
Authors:- M. Tech. Scholar Bibhuti Bhusan Panda, Prof. C S Koli, Prof. Amit Agrawal

Abstract- Drying basic crops with solar energy is of great economic importance across the world, especially in India, where a large share of crops and grains are lost to fungal and microbial attack. Proper drying could easily prevent these losses and enables crops and grains to be stored over long periods. India is blessed with abundant solar energy all year round. Moisture removal is one of the most important and most energy-consuming processes in the food processing, chemical, printing, and textile-dyeing industries. At the farm level, drying is done in open yards without any sanitary conditions. In most cases a drying temperature between 40°C and 25°C is maintained, depending on the product and production method, and a conventional fuel such as electricity, firewood, diesel, furnace oil, or kerosene supplies that energy. The objective of this project is to modify the design of a forced-convection indirect solar dryer and to test its performance. The system comprises an air-heating section, and the solar dryer consists of several components, for example a solar panel, battery, heating element, and blower. The blower passes the hot air to the required space so that the moisture content there is removed. The dryer offers better control over drying, and the product obtained is of better quality than with open sun drying. A solar dryer can also be operated at the higher temperatures recommended for deep-layer drying.

The Importance of Artificial Intelligence on Recruitment Industry
Authors:- Asst. Prof. Roopa U, Savita Hosamani

Abstract- The talent acquisition team of HR oversees discovering, procuring, evaluating, and hiring candidates who fit the roles required to meet company goals and fulfil project requirements. Many challenges are faced in the recruitment industry, and for any organization to achieve its goals a committed workforce is a must. The traditional method of recruitment was time-consuming and cost-ineffective. Through Artificial Intelligence (AI) we identify which factors should receive the most attention, so that a skilled employee can be hired with less sourcing time, making recruitment cost-effective for the organization as well.

Comparative Study of Springs of Almora and Model Generated Values for Selected Stations of Spring Fed River Kosi in Uttarakhand with Reference to Nitrate
Authors:- Research Associate Pooja Rani Sinha, Kireet Kumar Scientist G, V. P. Uniyal Scientist G

Abstract- The current study compares the nitrate contamination of the springs of Almora with that of the spring-fed river Kosi at selected stations along the stretch. Contamination levels in the Almora springs range between 30 mg/l and 48 mg/l, which is above the permissible limit of 40-45 mg/l suggested by BIS, whereas nitrate levels in the spring-fed river Kosi range from 2 mg/l to 15 mg/l. The study focuses on the dilution of the springs' contamination after they feed the river Kosi, incorporating the nitrate contamination of the selected springs and of the selected stations of the river. The statistical model WASP generates nitrate values for the river Kosi; comparing the simulated and observed values indicates the probable nitrate contamination level of the river in the near future.
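
Comparing WASP-simulated and observed values typically reduces to an error statistic such as RMSE; the station values below are hypothetical, chosen within the 2-15 mg/l range reported in the abstract, and the simulated column merely stands in for model output.

```python
import math

def rmse(observed, simulated):
    """Root-mean-square error between observed and model-simulated values."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(observed, simulated))
                     / len(observed))

# Hypothetical nitrate concentrations (mg/l) at five river stations:
observed = [2.0, 5.5, 9.0, 12.0, 15.0]
simulated = [2.4, 5.0, 9.6, 11.2, 14.5]
error = rmse(observed, simulated)
```

A small RMSE relative to the observed range would support using the model's projections of future contamination.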

Acceptance Criteria of Concrete as per IS: 456-2000, 4th Revision, Amendment No. 4
Authors:- Posannapeta Y Ganga Ram

Abstract- In Amendment No. 4 to IS 456:2000, the acceptance criteria for concrete were slightly modified, making them very simple and common to M15 and above grades of concrete work. Even so, in most cases there is considerable confusion over the acceptance and finalization of concrete strength: although the criteria are clearly stated in the IS code, they are interpreted in various ways. To overcome this issue, this paper clearly describes the acceptance and finalization of concrete strength based on IS 456:2000, 4th revision, Amendment No. 4.
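
As an illustration only, the commonly cited form of the criteria (mean of a group of four samples ≥ fck + 0.825σ or fck + 3 N/mm², whichever is greater; any individual result ≥ fck − 3 N/mm²) can be sketched as below. This is the author's paraphrase of the rule, not the clause text; verify against the amendment before use.

```python
def concrete_accepted(samples, fck, sigma):
    """Check a group of four cube-test results (N/mm^2) against the commonly
    cited form of the IS 456 (Amendment No. 4) acceptance criteria.
    `fck` is the characteristic strength, `sigma` the established standard
    deviation. Verify the thresholds against the code text before relying on this."""
    mean_ok = sum(samples) / len(samples) >= max(fck + 0.825 * sigma, fck + 3.0)
    individual_ok = min(samples) >= fck - 3.0
    return mean_ok and individual_ok

# M25 concrete (fck = 25 N/mm^2) with an assumed standard deviation of 4 N/mm^2
ok = concrete_accepted([28.5, 30.0, 29.0, 31.5], fck=25.0, sigma=4.0)
```

Both conditions must hold: a group failing either the mean criterion or the individual minimum is not accepted.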

A Review on Power Quality Based on UPFC
Authors:- Research Scholar Yamunadhari Kumar, Prof. Mamta Sood, Dr. Manju Gupta, Dr. Anuprita Mishra

Abstract- Power electronic controllers for a flexible AC transmission system (FACTS) can offer greater control of power flow, secure loading, and damping of power system oscillations. The unified power flow controller (UPFC) is a FACTS element that can provide VAR compensation, line impedance control, and phase angle shifting. The UPFC consists of two fully controlled inverters: a series inverter connected in series with the transmission line through a series transformer, and a parallel inverter connected in parallel with the line through a parallel transformer. The real and reactive power flow in the transmission line can be controlled by changing the magnitude and phase angle of the voltage injected by the series inverter. The basic function of the parallel inverter is to supply, through the common DC link, the real power demanded by the series inverter; the parallel inverter can also generate or absorb controllable reactive power. This paper surveys most published work that has used a UPFC, an advanced member of the FACTS family, to improve the active and reactive power flow of power systems. It focuses on three techniques for including steady-state models of the UPFC in power flow programs, and also reviews the benefits and applications of the UPFC in power flow studies, such as loss minimization and enhancement of loadability and voltage stability, using various optimization techniques. A case study analysing the effect of a UPFC using a comprehensive NR-method-based power flow is also presented.
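
The steady-state effect the series injection exploits follows from the standard two-bus transfer equations, P = VsVr sin δ / X and Qs = (Vs² − VsVr cos δ) / X. The sketch below uses assumed per-unit values and models the injection simply as a shift of the sending-end angle; it is an illustration of the mechanism, not of any particular UPFC model from the reviewed papers.

```python
import math

def line_flows(v_s, v_r, delta_deg, x):
    """Real and sending-end reactive power over a lossless line (per-unit):
    P = Vs*Vr*sin(d)/X,  Qs = (Vs^2 - Vs*Vr*cos(d))/X."""
    d = math.radians(delta_deg)
    p = v_s * v_r * math.sin(d) / x
    q = (v_s ** 2 - v_s * v_r * math.cos(d)) / x
    return p, q

# Baseline flow, then the same line with the effective sending-end angle
# enlarged by a series voltage injection (assumed values for illustration):
p0, q0 = line_flows(1.0, 1.0, 10, 0.1)
p1, q1 = line_flows(1.0, 1.0, 20, 0.1)
```

Doubling the effective angle raises the real power transfer, which is the lever the UPFC's series inverter provides.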

A Review on Hybrid Renewable Energy System Using Dynamic Voltage Restorer
Authors:- Research Scholar Nilesh Kumar Choudhary, Asst. Prof. Amarnath Mukherjee, Asst. Prof. Parikshit Bajpai

Abstract- This paper presents a new system for the integration of a grid-connected photovoltaic (PV) system together with a self-supported dynamic voltage restorer (DVR). Power quality (PQ) is gaining a great deal of importance as more sensitive loads are introduced into the utility grid. Degradation of product quality, damage to equipment, and temporary shutdowns are the general issues associated with PQ problems in industry, and any mal-operation or damage to sensitive industrial loads results in monetary losses disproportionately higher than the severity of the PQ issue. The evolution of power electronics technology has replaced traditional power quality mitigation methods with Custom Power System devices (CUPS); the major power-electronic-controller-based CUPS are the DSTATCOM, the DVR, and the UPQC. The DVR is a pertinent solution to the economic losses caused by PQ issues in industry, and among the CUPS it is the most cost-effective. In the published literature, only a few papers review DVR technology; in this paper, a systematic review of the published literature is conducted and the design, standards, and challenges of DVR technology are described. In addition to the energy variability of renewable energy sources, random voltage sags, swells, and disruptions are already a major issue in power systems, and recent advances in power electronic devices have provided a platform for new solutions to the voltage support problem.
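
The core DVR action reviewed above is a series voltage injection that restores the load voltage during a sag; a minimal per-unit phasor sketch follows, where the sag depth and phase jump are illustrative assumptions rather than values from the paper.

```python
import cmath, math

def dvr_injection(v_presag, v_sag):
    """Series voltage phasor a DVR must inject so the load again sees the
    pre-sag voltage (pre-sag compensation sketch, per-unit phasors)."""
    return v_presag - v_sag

v_ref = cmath.rect(1.0, 0.0)                        # 1.0 pu at 0 degrees
v_during_sag = cmath.rect(0.7, math.radians(-10))   # 30% sag with a phase jump
v_inj = dvr_injection(v_ref, v_during_sag)
```

Adding the injected phasor to the sagged supply voltage reproduces the reference exactly, which is the condition the DVR's series converter and DC link must support.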

Review Article of Heat Exchanger Performance Comparison Using Two-Variance ANOVA Method
Authors:- Kamlesh Ravat, Asst. Prof. Saumitra Kumar Sharma

Abstract- Heat exchangers are among the most important heat transfer apparatus used in industries such as oil refining, chemical engineering, and electric power generation. Shell-and-tube heat exchangers have been the most commonly and effectively used type in industry over the years. This paper reviews the outline and types of heat exchangers and their thermal and mechanical design using the ASME and TEMA standards, taking a modern shell-and-tube heat exchanger as a case study.

A Review Article of HV Relay Based Protection and Power Stabilization Using Scheduling Switching Time
Authors:- Neha Yadav, Dr. A. K. Kori

Abstract- Due to the increasing demand for energy and the need for non-conventional energy sources, distributed generation (DG) has come into play, and the traditional pattern of unidirectional power flow has been gradually shifting. With new technology come new challenges: introducing DG into the conventional power system raises various difficulties, one of the major ones being system protection in the presence of DG sources. These sources pose a significant challenge due to bidirectional flows from DGs as well as the lower fault current contribution of inverter-interfaced DGs. This paper reviews existing protection schemes that have been suggested for active distribution networks. Most of these protection strategies apply only to smaller distribution systems, implying that they may need to be extended to larger systems with a much higher penetration of distributed generation. Finally, a potential protection scheme is recommended as future work.

A Review Article of Power System FACTS Controller Implementation Using Deep Learning Techniques
Authors:- M. Tech. Scholar Princy Singh, Dr. A K Kori

Abstract- The recent advances in computing technologies and the increasing availability of large amounts of data in smart grids and smart cities are generating new research opportunities in the application of Machine Learning (ML) for improving the observability and efficiency of modern power grids. However, as the number and diversity of ML techniques increase, questions arise about their performance and applicability, and on the most suitable ML method depending on the specific application. Trying to answer these questions, this manuscript presents a systematic review of the state-of-the-art studies implementing ML techniques in the context of power systems, with a specific focus on the analysis of power flows, power quality, photovoltaic systems, intelligent transportation, and load forecasting. The survey investigates, for each of the selected topics, the most recent and promising ML techniques proposed by the literature, by highlighting their main characteristics and relevant results. The review revealed that, when compared to traditional approaches, ML algorithms can handle massive quantities of data with high dimensionality, by allowing the identification of hidden characteristics of (even) complex systems. In particular, even though very different techniques can be used for each application, hybrid models generally show better performances when compared to single ML-based models.

A Review Article Enhancement of Image Forgery and Improvement of Image Parameters Using DWT Algorithm
Authors:- Rajni Soni, Asst. Prof. Hemant Amhia

Abstract- This paper presents a new system for integration of a grid-connected photovoltaic (PV) system together with a self-supported dynamic voltage restorer (DVR).Power quality (PQ) is gaining a great deal of importance as more sensitive loads are introduced into the utility grid. The degradation of product quality, damage of equipment and temporary shutdowns are the general issues associated with PQ problems in industries. Any mal-operation or damage of the industrial sensitive loads results in monetary losses disproportionately higher than the severity of the PQ issues. The evolution of power electronics technology replaced the traditional power quality mitigation methods with the introduction of Custom Power System devices (CUPS). The major power electronic controller based CUPS are DSTATCOM, DVR and UPQC. DVR is a pertinent solution for the economic losses caused by the PQ issues in the industries. Among the CUPS, DVR is the most cost-effective one. In the published literature, only a few papers correspond to the review of DVR technology. In this paper, a systematic review of published literature is conducted and a description is given on the design, standards and challenges in the DVR technology. In addition to the energy variability of renewable energy sources, random voltage sags, swells and disruptions are already a major issue in power systems. Recent advances in power electronic devices have provided a platform for new solutions to the voltage support problem in power systems.

Review Article of Hybrid System – A Framework to Generate Power at Reasonable Cost Without Harming the Balance of Nature
Authors:- Student of ME Shashi Bala Masram, Dr. Anil Kumar Kori, Asst. Prof. Pawan Kumar Pandey

Abstract- Because solar and wind power are intermittent and unpredictable in nature, higher penetration of these sources in an existing power system can create serious technical challenges, especially for weak grids or stand-alone systems without adequate storage capacity. By integrating the two renewable resources in an optimal combination, the impact of their variable nature can be partially resolved and the overall system becomes more reliable and economical to run. This paper provides a review of the challenges, opportunities, and solutions of hybrid solar PV and wind energy integration systems. Voltage and frequency fluctuation and harmonics are major power quality issues for both grid-connected and stand-alone systems, with a bigger impact in the case of a weak grid; these can be resolved to a large extent through proper design, advanced fast-response control facilities, and good optimization of the hybrid system. The paper reviews the main research reported in the literature on optimal sizing design, power electronics topologies, and control, and presents the state of the art of both grid-connected and stand-alone hybrid solar and wind systems.

Computer Aided Modeling and Simulation of Engine Block Fins with Various Fin Materials
Authors:- Research Scholar Anukaran Jalonha, Asst. Prof. Nilesh Sharma, Asst. Prof. Divyadarshani Dhakre

Abstract- Internal combustion engines convert the chemical energy of the fuel into mechanical work. When the fuel is ignited in the power stroke, whether by spark ignition as in a petrol engine or by compression ignition as in a diesel engine, power is produced and the piston moves continuously between top dead centre and bottom dead centre. These piston movements are very fast, and become even faster as engine speed increases. This motion, together with the burning fuel, creates a great deal of heat inside the cylinder and ultimately degrades engine performance in terms of operation and durability, so heat dissipation from the engine cylinder must be considered. As far as heat dissipation is concerned, the surface area of the cylinder's outer body needs to be increased; for this reason fins are used as extended surfaces, raising the rate of heat transfer to the atmosphere. This work focuses on various fin materials for optimum engine performance. We modelled an IC engine cylinder, applied various fin materials to the cylinder block, and analysed the block at actual operating conditions, taking the maximum temperature at the inner side of the cylinder as 920°C and obtaining the convection parameters theoretically for the different materials. CATIA software was used for the modelling and the ANSYS software tool for the analysis. From the results we carry out a comparative study and draw our conclusions.
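
The material comparison reduces to the standard adiabatic-tip fin relation, Q = √(hPkAc)·ΔT·tanh(mL) with m = √(hP/(kAc)); the geometry, convection coefficient, and conductivity values below are illustrative assumptions, not the paper's data.

```python
import math

def fin_heat_rate(h, p, k, a_c, length, dT):
    """Heat dissipated by a straight fin with an adiabatic tip:
    Q = sqrt(h*P*k*A_c) * dT * tanh(m*L),  m = sqrt(h*P/(k*A_c))."""
    m = math.sqrt(h * p / (k * a_c))
    return math.sqrt(h * p * k * a_c) * dT * math.tanh(m * length)

# Assumed fin geometry and convection coefficient (SI units):
h, p, a_c, L, dT = 25.0, 0.2, 1e-4, 0.05, 60.0
materials = {"aluminium": 205.0, "copper": 385.0, "cast iron": 52.0}  # k, W/m-K
rates = {name: fin_heat_rate(h, p, k, a_c, L, dT)
         for name, k in materials.items()}
```

Under these assumptions the higher-conductivity materials dissipate more heat per fin, which is the kind of ranking the FEA comparison in the paper would also produce.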

A Review on Power Quality Based on UPFC
Authors:- Research Scholar Renu Kesharwani, Asst. Prof. Parikshit Bajpai

Abstract- Power electronic controllers for a flexible AC transmission system (FACTS) can offer greater control of power flow, secure loading, and damping of power system oscillations. The unified power flow controller (UPFC) is a FACTS element that can provide VAR compensation, line impedance control, and phase angle shifting. The UPFC consists of two fully controlled inverters: a series inverter connected in series with the transmission line through a series transformer, and a parallel inverter connected in parallel with the line through a parallel transformer. The real and reactive power flow in the transmission line can be controlled by changing the magnitude and phase angle of the voltage injected by the series inverter. The basic function of the parallel inverter is to supply, through the common DC link, the real power demanded by the series inverter; the parallel inverter can also generate or absorb controllable reactive power. This paper surveys most published work that has used a UPFC, an advanced member of the FACTS family, to improve the active and reactive power flow of power systems. It focuses on three techniques for including steady-state models of the UPFC in power flow programs, and also reviews the benefits and applications of the UPFC in power flow studies, such as loss minimization and enhancement of loadability and voltage stability, using various optimization techniques. A case study analysing the effect of a UPFC using a comprehensive NR-method-based power flow is also presented.

Research on Online Crime Server and Management
Authors:- Madhuri Babar, Pranjal Sahare, Rahul Katre, Pankaj Ganvir, Badal Sakharwade, Rani Chikate

Abstract- Nowadays, many of the crimes committed go unreported to the authorities. Given this fact, this study presents the development of an Online Crime Server. The idea draws its motivation from the inconvenience of going to the police station, the perceived weakness of the authorities' investigative capabilities in resolving petty crimes, and the limited spreading of crime information to the community. The project specifically looks into crime detection and prevention. This study aims to provide an overview of the investigative process and, in doing so, identify effective and efficient approaches to the investigation and detection of volume crimes. The review particularly highlights the research evidence on those investigative practices and actions that are likely to lead to a positive outcome. The development of the software includes the system architecture, flow chart, project module formulation, modules and more. The study also shows that distance greatly influences how crimes are handled, with many crimes going unreported as a result. The Crime Server would help the complainant and the authority communicate privately and easily regarding the reported issue. In addition, it would be easier for the complainant to report a witnessed crime without the fear of getting involved in the case, because of the security that only an authorized user can see the report.

COVID-19 Infodemic and the Media in Times of Crisis
Authors:- Dr. Neetu Anand, Yash Pandey, Shyam Sharma

Abstract- The purpose of this research is to evaluate the crucial role of the media in information distribution, much as it did in the earlier pandemics of SARS, MERS, and H1N1. The rapid spread of the disease worldwide caused public concern, and the various unknowns about this new pathogen prompted a panic. As a result, the news media became an essential source of knowledge about the new coronavirus; yet, there are numerous benefits and drawbacks to consider. For the first time in history, responsible use of these tools may help quickly distribute crucial information and relevant scientific results, share diagnosis, cure and follow-up protocols, and evaluate different techniques globally, reducing geographic borders. We describe the most considerable information on the impact, benefits, and drawbacks of using media networks during the COVID-19 epidemic in this study.

Survey on Cloud Attack Types and Detection Techniques
Authors:- M.Tech. Scholar Rakesh Jat, Asst. Prof. Sumit Sharma

Abstract- The cloud environment gives flexible access to different services and products. This ease of access increases vulnerability to attack as well. Many researchers are working in the field of attack prevention and detection. This paper presents a detailed survey on the various types of attacks present in the cloud. Attacks are classified into a few categories according to the nature of the intrusion and its effects. Attack detection techniques developed by different scholars are also summarized in the paper for a clear understanding of attack detection features. Evaluation parameters used to compare two or more detection algorithms are also given in the paper.

A Method for Privacy-Preserving Authentication Based on Hybrid Cryptography for Vanet
Authors:- M.Tech. Scholar Kumar Manish, Asst. Prof. Dr. Neetesh Raghuwanshi, Asst. Prof. Dr. Bharti Chourasia

Abstract- The Internet of Things (IoT) is a novel system interfacing things, for example, users, vehicles, and home devices, through electronic tags, sensors, actuators, and interactive software. IoT guarantees the connection and communication between objects by digital means. Scenarios such as intelligent vehicle systems and smart home systems can become more convenient, comprehensive, and intelligent with the support of IoT technology. Secure communication between vehicle and infrastructure/roadside unit (V2I/R) over a VANET, and accurately identifying an attacker vehicle, is a major challenge in the modern age. In this work, we implement a hybrid encryption technique combining the AES and RSA algorithms and compare its performance with a previous method based on the simulation time of the encryption and decryption processes, throughput, and buffer size, evaluated experimentally.
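The hybrid pattern the abstract describes can be sketched in a few lines: an asymmetric (RSA) key pair protects a random session key, and the session key encrypts the bulk message. The sketch below is a toy illustration only, not the authors' implementation: it uses textbook RSA with small fixed primes and a simple XOR keystream standing in for AES, so that it stays dependency-free and insecure-by-design.

```python
import secrets

# Textbook RSA with small fixed primes (illustration only, NOT secure).
P, Q = 61, 53
N, PHI = P * Q, (P - 1) * (Q - 1)
E = 17
D = pow(E, -1, PHI)          # private exponent (modular inverse of E)

def rsa_encrypt(byte): return pow(byte, E, N)
def rsa_decrypt(num):  return pow(num, D, N)

def xor_stream(data: bytes, key: bytes) -> bytes:
    """XOR keystream standing in for the AES step of the hybrid scheme."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Sender: wrap a random session key with RSA, encrypt the payload with it.
session_key = bytes(secrets.randbelow(256) for _ in range(16))
wrapped_key = [rsa_encrypt(b) for b in session_key]
ciphertext  = xor_stream(b"vehicle V42 at junction 7", session_key)

# Receiver: unwrap the session key with the RSA private key, then decrypt.
recovered_key = bytes(rsa_decrypt(n) for n in wrapped_key)
plaintext     = xor_stream(ciphertext, recovered_key)
print(plaintext)  # b'vehicle V42 at junction 7'
```

A real VANET deployment would replace the XOR step with AES and the toy primes with 2048-bit RSA keys; only the wrapping structure is the point here.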

Smart Pollution Monitoring System
Authors:- Ayush Kumar Tiwari, Aman Goyal, Akshita Sharma, Pragya Tewari

Abstract- Excessive growth in industrial and infrastructure development gives rise to environmental issues such as climate change and land pollution. Pollution has become a significant issue, so there is a need to build a robust system that overcomes these problems and monitors the effects of pollution. The solution incorporates Internet of Things (IoT) technology, which bridges computer and device science. It can provide ways of monitoring the quality of environmental parameters such as air, noise, temperature, humidity and light. To monitor the pollution levels of an industrial environment or a particular area of interest, a wireless embedded computer system is proposed. The system uses a prototype implementation that includes sensors, an Arduino board, and an ESP8266 Wi-Fi module. These sensors are integrated with the wireless embedded computer program to monitor deviations of parameter levels from their normal values. The aim is to create a robust framework for monitoring environmental parameters.

A Review of Glaucoma Detection using Machine Learning
Authors:- PG Scholar Madhup Pandey, Asst. Prof. Dilip Singh Solanki

Abstract- Glaucoma is a disease in which the optic nerve of the eye is destroyed, causing vision loss or blindness. However, with earlier diagnosis and treatment, the eyes can be protected against serious vision loss. Most cases of vision loss due to Glaucoma are preventable if treatment of the illness is begun in its early stages. Peripheral vision is often damaged by Glaucoma sooner than a person's central vision, since the disease does not give any signs or symptoms. The current clinical systems to detect Glaucoma are time-consuming and unreliable. We propose a low-cost, computer-based Glaucoma detection framework that uses algorithms to promptly detect and classify healthy and Glaucoma eyes. It does this by analysing the region of interest (ROI) of images through the extraction of various image features, such as GLCM (grey-level co-occurrence matrix) features and wavelet-based texture features like the multiscale local binary pattern. For the classification of healthy and Glaucoma eyes, we propose a supervised machine learning approach.

Intrusion Detection System using Machine Learning Approach
Authors:- Jyoti Ahirwar, Dr. Mukesh Yadav

Abstract- Random destructive acts against a single computer or a complete network may be observed on the internet from time to time. As computer connectivity continues to expand at an unprecedented rate, it is becoming more difficult to keep up, and security concerns arise on the internet just as they do in person. An intrusion detection system (IDS) is intended to recognise and examine such hostile behaviours occurring throughout a network; it assists in the detection of attacks on the system and the identification of intruders. Various machine learning (ML) approaches have been applied to intrusion detection systems in the past, with the objective of improving intruder detection results and raising the accuracy of the IDS. In this work, we offer a strategy for creating an efficient IDS that makes use of principal component analysis (PCA) and a CNN classification algorithm: PCA organises the data by lowering its dimensionality, while the classifier categorises it. The experiments are carried out with the suggested system on the KDD (Knowledge Discovery in Databases) dataset. When compared to other approaches such as SVM, Naive Bayes, and Decision Tree, the recommended methodology performs more efficiently in terms of accuracy. We obtained the following results using our proposed method: a running time of 3.24 minutes, an accuracy of 96.78 percent, and an error rate of 0.21 percent.
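The reduce-then-classify pipeline described above can be sketched with NumPy alone. The synthetic two-class data, the choice of 5 components, and the nearest-centroid classifier below are all illustrative assumptions standing in for the KDD features and the paper's CNN stage; only the PCA-then-classify structure is the point.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two classes ("normal" vs "attack") in 20 dimensions, separated on a few axes.
normal = rng.normal(0.0, 1.0, size=(200, 20))
attack = rng.normal(0.0, 1.0, size=(200, 20)); attack[:, :3] += 4.0
X = np.vstack([normal, attack])
y = np.array([0] * 200 + [1] * 200)

# PCA via SVD: centre the data, decompose, keep the top 5 principal components.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:5].T                       # reduced 5-D representation

# A nearest-centroid classifier stands in for the CNN classification stage.
centroids = np.array([Z[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((Z[:, None, :] - centroids) ** 2).sum(-1), axis=1)
acc = (pred == y).mean()
print(f"training accuracy: {acc:.2f}")
```

On real traffic data the classifier would be trained and evaluated on disjoint splits; the sketch keeps everything in-sample for brevity.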

Investigation Of Solar Water Heater Designed Model Using CFD Fluent
Authors:- Konark Tripathi, Asst. Prof. Deepak Solanki

Abstract- Renewable energy sources are useful in crisis situations, when there are problems with energy consumption and the environment. At first, the usage of solar energy was restricted, as was the choice to use it. This work is meant to be a step forward in the way that technology interacts with nature, as well as a fresh way to use renewable energy and to make it the main energy source in the future. The project's goal is to use computational fluid dynamics (CFD) models of various tubes to compare heating methods for solar water heaters. Three tube designs are considered for the water heater: straight, 'S' pattern, and 'U' pattern; in the first, water flows in a straight line, in the second it moves in an 'S' shape, and in the third it moves in a 'U' shape. When testing the designs, we gathered information on how well each would work, and the findings were compared with the CFD results.

A Review On Performance Of Power Quality Improvement Of Hybrid Energy In Grid Connected System
Authors:- Research Scholar Sweta Kumari, Prof. Dr. Manju Gupta, Asst. Prof. Mamta Sood, Prof. Dr. Anuprita Mishra

Abstract- This paper presents a review of grid integration and the power quality issues associated with integrating renewable energy systems into the grid, and the role of power electronic devices and Flexible AC Transmission Systems in addressing these issues. Recent trends in power electronics for the integration of wind and photovoltaic (PV) power generators are presented, along with discussions of common and future trends in renewable energy systems based on the reliability and maturity of each technology. A classification of the various power quality issues treated by different researchers has been made and is provided for reference, and the application of various techniques to mitigate the different power quality problems is also presented for consideration. The power electronics interface not only plays a very important role in the efficient integration of wind and solar energy systems but also affects power-system operation, especially where the renewable energy source constitutes a significant part of the total system capacity. However, there are various issues related to grid integration of renewable energy sources (RES); in view of the aforesaid trends, it becomes necessary to investigate possible solutions for these issues, and power quality is the main problem with renewable energy sources. Nowadays there is a scarcity of non-renewable resources, and consumer demand is increasingly fulfilled by renewable energy resources. The usage of renewable energy sources is still low compared to other energy sources, and renewable energy also causes PQ issues in the grid; some of these issues are sag, swell, flicker, harmonics, interruptions and voltage imbalance. This review shows what other issues are caused by solar and wind energy when connected to the grid and how they can be mitigated.

Breast Cancer prediction using Machine Learning
Authors:- Naireen Arshad

Abstract- Technology is evolving day by day and has majorly affected the healthcare sector. A lot of advancements have been made in healthcare which are beneficial for human beings. All the discoveries and inventions made so far are aimed at providing better services to humans and making their lives more comfortable. Due to the heavy burden that doctors face, the IT field has stepped forward to ease their tasks. People are affected by various different types of diseases, and one such disease is cancer. It is a deadly disease and, if not treated at the right time, it might result in the loss of a life. Various tests are carried out to detect its presence, but the reports might take time, which can lead to the death of the patient. In this paper, we have tried to use machine learning to detect breast cancer in human beings so that appropriate measures and medications can be started as soon as possible.

Rover Sieth
Authors:- Durgesh Nikam, Priti Yadav, Archana Ramesh, Prof. Sushma Patwardha

Abstract- The main objective of this paper is to develop a robot to perform surveillance in domestic areas. Nowadays robots play a vital role in our day-to-day activities, reducing human labor and human error. Robots can be manually controlled or automatic, based on the requirement. The purpose of this robot is to roam around, gather audio and video information from the given environment, and send that information to the user. In this project, one can control the robot with a mobile or laptop through the Internet of Things and also get live video streaming, both in daytime and at night, with the help of a wireless camera on the robot. The robot can be controlled in both manual and automated modes with the help of an Arduino microcontroller. The robot also uses various sensors that collect data and send it to the Arduino microcontroller, which controls the robot; thus the act of surveillance can be performed. Further advancement of the project could provide surveillance even in defense areas.

Detection & Solution for Out of Step Problem Based on Optimal Location of PMU
Authors:- Mohamed Reda Elshahat, Dr. Adel Ali Emery, Prof. Dr. Saady Abd El Hameed El Sayed

Abstract- A power swing is a disturbance occurring in a small part of a real grid that can cause a total or partial blackout of the electrical system. This paper introduces a real case study of a large power swing event that was detected by the conventional out-of-step method. The relay disconnected the line which had suffered most from the oscillation, but the oscillation moved to the rest of the power system and caused a complete blackout. A new technique is proposed in this paper to solve the problem of the oscillation transferring to the rest of the power network. The technique is based on a program developed using the Power System Simulator for Engineering (PSSE), phasor measurement units (PMUs), and artificial intelligence (AI), and it succeeds in solving the problem without the oscillation transferring to the rest of the system. The paper also determines the optimal number and locations of PMUs. In addition, the PMU data are applied to the AI system to determine which buses or lines are responsible for generating the first oscillation condition.

Air Canvas: Draw in Air Using AI
Authors:- Prof. Hemlata A. Shinde, Shravani M. Jagtap, Anushka A. Kalpund, Pranita B. More, Ayushi A. Parkale

Abstract- Drawing in air has been one of the most fascinating and interesting research areas in the field of visual pattern recognition. Here, visual pattern recognition means recognizing the movement of fingertips. It improves the interaction between man and computer in various applications and will help in achieving the naturalness desired for Human-Computer Interaction (HCI). The proposed method has two main tasks: first it tracks the fingertip, and second it plots the coordinates of the fingertip on the screen in any desired colour. It does not require any keypad, pen or glove, only a camera. This idea of an Air Canvas goes beyond the traditional empty (white), rectangular, flat canvas seen in traditional artworks. We apply computer vision techniques in OpenCV to build this project; to achieve its goal, fingertip tracking and detection are used. Air Canvas refers to drawing virtually through hand gestures in the air without touching anything, which is recommended during COVID-19. This project can be a powerful means of communication for the deaf and specially abled, and for senior citizens and children for educational purposes.
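The second task of the method, plotting fingertip coordinates as strokes, can be sketched independently of the camera pipeline. The class below is a hypothetical illustration of that trail-buffering step: fingertip detection itself (OpenCV colour masking or contour tracking) is assumed to happen upstream and feed coordinates in.

```python
from collections import deque

class AirCanvas:
    """Accumulates fingertip positions into drawable line segments."""

    def __init__(self, maxlen=512):
        self.trail = deque(maxlen=maxlen)   # recent fingertip positions
        self.strokes = []                   # line segments drawn so far

    def on_fingertip(self, point):
        """Record a new tip position and extend the current stroke."""
        if self.trail:
            self.strokes.append((self.trail[-1], point))
        self.trail.append(point)

    def lift_finger(self):
        """End the current stroke (tip left the frame or drawing paused)."""
        self.trail.clear()

canvas = AirCanvas()
for pt in [(10, 10), (12, 14), (15, 20)]:   # simulated tracker output
    canvas.on_fingertip(pt)
canvas.lift_finger()
canvas.on_fingertip((50, 50))               # starts a new, disconnected stroke
print(canvas.strokes)  # [((10, 10), (12, 14)), ((12, 14), (15, 20))]
```

In the full system each stroke segment would be rendered with `cv2.line` onto the camera frame; clearing the trail on finger lift is what prevents unwanted lines joining separate gestures.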

Research Paper: The Study on Impact of Artificial Intelligence on Innovation
Authors:- Alka Sharma

Abstract- This research paper draws a conclusion on the impact of artificial intelligence on innovation. For this purpose, secondary research was carried out, in which the ideas presented by different researchers through their relevant research works were reviewed to draw a conclusion in support of the research topic. The conclusion suggests that artificial intelligence has a significant impact on innovation, enabling modern business organizations to present themselves in a different light than their competitors in terms of uniqueness.

A Review Article: Modelling of CMOS-Based Highly Sensitive MEMS Design, Reducing Noise Signal and Enhancing Its Performance
Authors:- Bipin Singh, Asst. Prof. Hemant Amhia

Abstract- This review article throws light on a highly promising and in-demand technology, which is set to revolutionize nearly every product category in the present era, while discussing the concept, design and development, fabrication techniques and applications of micro-electro-mechanical systems (MEMS) based devices and systems. Microelectromechanical systems show outstanding flexibility and adaptability in miniaturized devices owing to their compact dimensions, low power consumption, and fine performance. MEMS devices have numerous and very high potentials for creating new fields of application for mobile equipment with increased flexibility and reliability. This work deals with research carried out on the development of MEMS-based sensors and actuators and appropriate uses of MEMS. It carries information regarding subsequent commercial and real-life applications of MEMS and discusses various recent technological innovations along with their advantages and disadvantages. It also describes the historical development of MEMS sensor technology.

A Review Article: Power Scheduling and Utility of Grid Demand Side Management Using FACTS Power Controller
Authors:- Preeti Kourav, Dr. Anil Kumar Kori

Abstract- Grids are considered the basic and fundamental technology through which environmental pollution and the user's energy cost are reduced. The management of smart grids is done by various demand side management (DSM) techniques to ensure an efficient flow of power, but this is a complex task in real time, as the energy demands of consumers rise continuously in an unpredictable manner. A literature survey is conducted to get an overview of the role of heuristic techniques in demand side management. The review states that such algorithms are able to schedule power cuts effectively, which in turn minimizes the load on the power grids. But as there are a number of heuristic algorithms available, it is a challenge to select the most efficient approach using a bus system. Moreover, important factors such as load, cost, etc. are also drawn out of the survey to help future research produce an efficient DSM system.

Mental Workload Detection from the WESAD Dataset Using Machine and Deep Learning Models: A Review
Authors:- Yashvant Dev, Mayank Namdev, Dr. Ritu Shrivastava, Dr. Rajiv Srivastava

Abstract- Mental stress is one of the major contributors to a variety of health issues. Various measures have been created by scientists and medics to determine the intensity of mental stress in its early phases, and several neuroimaging methods have been developed to assess mental stress in the workplace. One key candidate is the electroencephalogram (EEG) signal, which offers a wealth of information regarding mental states and conditions. In this study, we analyse trustworthy heart rate variability (HRV) metrics in order to detect stress. We refer to the WESAD dataset, for which an experiment protocol was established including two different sensors corresponding to a range of everyday-life conditions. We present our work on the development of a stress detection system based on heart rate, calculating and comparing HRV features from time- and frequency-domain analysis and classifying these features with machine learning and deep learning algorithms.
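The time-domain HRV features the abstract refers to are standard and easy to sketch. The RR-interval series below is synthetic (milliseconds between heartbeats), not WESAD data; frequency-domain features such as LF/HF power would follow by resampling the RR series and applying an FFT.

```python
import numpy as np

# Synthetic RR intervals in milliseconds, standing in for sensor data.
rr = np.array([812, 800, 790, 820, 845, 830, 805, 795, 810, 825], dtype=float)

sdnn  = rr.std(ddof=1)                       # overall variability (SDNN)
diffs = np.diff(rr)
rmssd = np.sqrt(np.mean(diffs ** 2))         # short-term variability (RMSSD)
pnn50 = np.mean(np.abs(diffs) > 50) * 100    # % of successive diffs > 50 ms

print(f"SDNN={sdnn:.1f} ms  RMSSD={rmssd:.1f} ms  pNN50={pnn50:.0f}%")
```

Feature vectors like `[sdnn, rmssd, pnn50, ...]` per time window are what the machine and deep learning classifiers mentioned above would consume.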

A Review on Thermal Analysis of Shell and Tube Heat Exchanger with Different Design of Baffle Plate
Authors:- M.Tech. Scholar Sonu Kumar, Prof. Neetesh Gupta

Abstract- A heat exchanger may be defined as a device that transmits thermal energy between two or more fluids of varying temperatures. Several industrial processes would be impossible to complete without this equipment. Refrigeration, air conditioning, and chemical plants all use heat exchangers. They are utilised for a variety of purposes, including transferring heat from a hot fluid to a cold one, and are commonly employed in a variety of industrial settings. Researchers have worked on a variety of projects in an attempt to increase performance. The velocity and temperature contour fields on the shell side, however, are much more complicated, and performance is influenced by baffle elements such as their arrangement and spacing scheme.

Numerical Analysis of Flow and Heat Transfer Enhancement in a Pipe with Twisted Tape
Authors:- M. Tech Scholar Kanhaiya Kumar, Prof. Neetesh Gupta

Abstract- This work aims to present a numerical model for heat transfer intensification in a heat exchanger tube equipped with a novel V-cut twisted tape. The effects of different cut ratios (0.6 < b/c < 1.25) on the turbulent flow characteristics and thermal performance of the system are investigated over the Reynolds number range from 4000 to 12000. All simulations are performed for fully developed turbulent flow in this Reynolds number range with a uniform heat flux of 5000 W/m2. The numerical results for heat transfer (Nusselt number, Nu), pressure drop (friction factor, f) and enhancement performance factor in a tube with twisted tapes (V-cut) are reported in the study.
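The three reported quantities are commonly folded into a single thermal performance factor, eta = (Nu/Nu0) / (f/f0)^(1/3), which compares the enhanced tube against the plain tube at equal pumping power. The numbers below are illustrative placeholders, not results from the paper.

```python
def performance_factor(nu, nu0, f, f0):
    """Thermal enhancement factor at constant pumping power."""
    return (nu / nu0) / (f / f0) ** (1.0 / 3.0)

# e.g. a twisted-tape insert raising Nu by 60% at double the friction factor
eta = performance_factor(nu=160.0, nu0=100.0, f=0.08, f0=0.04)
print(f"eta = {eta:.2f}")   # eta > 1 means the insert pays off overall
```

The cube root on the friction-factor ratio is what makes moderate pressure-drop penalties acceptable when the heat-transfer gain is large.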

A Review Article of ANN Based Combined Transformer Error Identification in Primary Load Imbalance Conditioning
Authors:- Neelam Katare, Dr. Anil Kumar Kori

Abstract- A combined transformer is an electrical equipment that needs continuous monitoring and fast protection, since it is a very expensive and essential element for the power system to perform effectively. Various methods for its protection are available, and most of them should be known before practically protecting the transformer. Here a review is presented of the various methods available for transformer protection. The most common protection technique is percentage differential logic, which discriminates between different operating conditions and internal faults. Some conditions, such as inrush current and CT saturation, can cause mis-operation of the differential protection.

A Review Article: Classification and Recognition in Soybean Leaf Disease Detection Using Convolutional Neural Networks (CNN)
Authors:- Bhanu Pratap Singh, Dr. Shailja Shukla

Abstract- Agriculture is the backbone of every economy on the planet. Crop output is one of the most important aspects influencing a country’s domestic market situation. Agricultural output is also a critical component of economic development in any country. It is vital because it offers raw materials, jobs, and food to a variety of citizens. Many factors contribute to the disparity in crop production estimates across the globe. Overuse of chemical fertilizers, the presence of chemicals in water supplies, irregular rainfall distribution, varying soil fertility, and other factors are among them. Aside from these concerns, one of the most common challenges around the world is the destruction of a large portion of production due to diseases.

An Efficient Approach for Reversible Realization of 8- Bit Adder-Subtractor Circuit
Authors:- Research Scholar Sabiha Fatima, Associate Prof. Dr. Bharti Chourasia

Abstract- The full adder is the heart of any central processing unit and a core component employed in all processors. The drive to minimize power loss in digital devices has led researchers to focus on reversible logic. This paper presents design approaches for the reversible realization of an 8-bit adder-subtractor circuit with optimized quantum cost. The design is compared with existing designs on selected performance parameters such as the total number of reversible gates, garbage outputs and quantum cost. The proposed 8-bit adder-subtractor circuit using the reversible approach is simulated using the ModelSim tool and synthesised with Xilinx ISE 14.7.
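One common reversible building block for such designs is the Peres gate, which maps (a, b, c) to (a, a XOR b, (a AND b) XOR c); two of them realise a full adder, the cell an 8-bit adder-subtractor chains. The bit-level simulation below verifies that construction; it is a generic reversible-logic sketch, not the specific gate arrangement of this paper.

```python
from itertools import product

def peres(a, b, c):
    """Peres gate: (a, b, c) -> (a, a^b, (a&b)^c). Reversible, quantum cost 4."""
    return a, a ^ b, (a & b) ^ c

def reversible_full_adder(a, b, cin):
    _, axb, p = peres(a, b, 0)        # axb = a^b, p = a&b (partial carry)
    _, s, cout = peres(axb, cin, p)   # s = a^b^cin, cout = ((a^b)&cin) ^ (a&b)
    return s, cout

# Verify against the conventional full-adder truth table, all 8 inputs.
for a, b, cin in product((0, 1), repeat=3):
    s, cout = reversible_full_adder(a, b, cin)
    assert s == a ^ b ^ cin
    assert cout == (a & b) | (a & cin) | (b & cin)
print("reversible full adder verified for all 8 input combinations")
```

An adder-subtractor then feeds `b XOR mode` into each cell (with `cin = mode` on the least significant bit) so that `mode = 1` performs two's-complement subtraction.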

MNIST Handwritten Digits Recognition
Authors:- Ms. Pragya Tewari, Yogesh Devtulla, Sumit Baroniya, Rishika Raj

Abstract- Our country has a huge population and thus a wide variety of handwriting, which makes handwritten digit recognition one of the difficult tasks for intelligent systems like computers. We therefore implement handwritten digit recognition, which recognises handwritten digits with the help of machine learning algorithms and a database like MNIST (Modified National Institute of Standards and Technology). The main aim of handwritten digit recognition is to recognise human handwritten digits from different sources like images, paper, touchscreens and the like, and classify them into 10 pre-defined classes (0-9). The objective of this paper is to observe the accuracy of CNNs for recognising handwritten digits using different hidden-layer approaches, the main use being to help recognise digits in software where the input comes from scanned-image-like sources.

Automated Mental Illness Analysis Using Voice Samples
Authors:- Sowhardh Honnappa Gowda

Abstract- One in 24 people suffers from a critical mental illness such as Schizophrenia, Psychosis, Clinical Depression, Anxiety Disorder, Obsessive Compulsive Disorder (OCD), Autism, Bipolar Disorder, Attention Deficit Hyperactivity Disorder (ADHD), etc. Prior work found that the average vector similarity between adjacent sentences in free speech, along with other variables like the number of words/phrases, pauses, tone, intensity, frequency and other low-level descriptors from the raw audio recording, could be used to identify clinically high-risk patients with great accuracy. Audio-visual hallucinations and thought insertion appear to be the top symptoms in patients suffering from Schizophrenia [3]. Acoustic studies of healthy versus depressed individuals [4] show that the top audio features which help identify depression in mental illnesses are loudness, MFCC5 and MFCC7. One study dealing with automated depression detection using audio features [5] suggests that the lack of objective clinical depression assessment methods is the key reason many patients cannot be treated appropriately on time. This study aims to find an optimal approach to calculating depression scores among people suffering from mental illnesses using artificial intelligence techniques.

Design and Analysis of L Band Microstrip Patch Antenna for Global Navigation Satellite System
Authors:- M. Tech. Scholar Shashi Mishra, Associate Prof. Dr. Bharti Chourasia

Abstract- In this paper the design considerations for a rectangular micro-strip antenna are presented. In modern wireless communication systems, micro-strip patch antennas are commonly used in wireless devices; therefore, miniaturization of the antenna has become an important issue in reducing the volume of the entire communication system. The various parameters of the rectangular micro-strip antenna, namely input impedance, VSWR, return loss and radiation pattern, have been investigated as functions of frequency for proper matching and radiation. Global navigation satellite system (GNSS) communication applications use the L, X, C, Ku and Ka bands; the L-band frequency range is 1-2 GHz, so it can be used for lower-frequency-range communication. The results show that the proposed method performs better than traditional methods.
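Rectangular patch dimensions are conventionally sized with the transmission-line model. The sketch below evaluates those standard equations at an L-band frequency; the substrate values (FR-4, eps_r = 4.4, h = 1.6 mm) and the GPS L1 frequency are illustrative assumptions, not parameters from the paper.

```python
from math import sqrt

C = 3e8  # speed of light, m/s

def patch_dimensions(f0, eps_r, h):
    """Transmission-line model: width, length of a rectangular patch (metres)."""
    w = C / (2 * f0) * sqrt(2 / (eps_r + 1))
    # effective permittivity accounts for fringing fields in air
    eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 / sqrt(1 + 12 * h / w)
    # fringing extends the electrical length by delta-L at each edge
    dl = 0.412 * h * ((eps_eff + 0.3) * (w / h + 0.264)) / \
         ((eps_eff - 0.258) * (w / h + 0.8))
    l = C / (2 * f0 * sqrt(eps_eff)) - 2 * dl
    return w, l

w, l = patch_dimensions(f0=1.575e9, eps_r=4.4, h=1.6e-3)   # GPS L1, L band
print(f"patch width = {w*1000:.1f} mm, length = {l*1000:.1f} mm")
```

These closed-form dimensions are then the starting point for the full-wave simulation and tuning a paper like this reports.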

Design and Implementation of Solar Powered BLDC Motor Driven Electric Vehicle
Authors:- Jaysingh Prajapati, Devendra Dohare

Abstract- Solar energy is used to feed a brushless DC motor operated using a four-switch model instead of the conventional six switches, with a PID fuzzy logic controller for better speed accuracy. Any equipment without power is an idle bunch of components; this is very prominent for those dependent on non-renewable sources, and shifting our source of energy to renewables is a proactive approach. This paper details the design of a solar-powered BLDC-motor-driven electric vehicle, one of the solutions for the oncoming crisis. The approach of selecting the appropriate components for this application is studied, and each of them is simulated and subjected to various tests in a real-time environment. The integrated system, consisting of the solar module, charge controllers, batteries, boost converter and BLDC motor, is developed into the solar-powered electric vehicle. The charge controllers direct the power acquired from the solar panel to the batteries; charging is done according to the state of the battery, so as to avoid overcharging and deep discharge. The primary aim is to increase the range of the electric vehicle.
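The speed-control loop at the heart of such a drive can be sketched with a plain discrete PID acting on a first-order motor approximation. The gains, set-point and motor time constant below are illustrative assumptions, not tuned values from the paper (which augments the PID with fuzzy logic).

```python
def simulate(setpoint=1000.0, kp=0.8, ki=2.0, kd=0.01, dt=0.01, steps=600):
    """Discrete PID driving a first-order motor model toward a speed set-point."""
    speed, integral, prev_err = 0.0, 0.0, 0.0
    tau = 0.5                                  # assumed motor time constant, s
    for _ in range(steps):
        err = setpoint - speed
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv   # control effort
        prev_err = err
        # first-order response: speed relaxes toward the drive input
        speed += dt * (u - speed) / tau
    return speed

final = simulate()
print(f"speed after 6 s: {final:.0f} rpm")
```

The integral term is what removes the steady-state speed error; a fuzzy supervisor, as in the paper, would adapt `kp`, `ki` and `kd` online instead of keeping them fixed.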

Bio-Geography Based Page Prediction Using Web Mining Feature
Authors:- Trivene Khede, Dr. Avinash Sharma

Abstract- A website is a good place to reach the audience of any field, and many companies use this platform for different businesses. Retaining a visitor on a website depends on the available content and the intelligence of the site. This paper develops an intelligent model that can predict the next web page by understanding the behavior of the user. A biogeography-based optimization genetic algorithm is used to predict the web page from past user visits. The work uses the web content and web log features of the website to evaluate the fitness value of the genetic algorithm's chromosomes. Experiments were done on real datasets of different sizes. Results show that the proposed model improves the values of different evaluation parameters.

Analysis of Students' Critical Thinking Skills Using Data Mining Approaches (Survey-Based Research)
Authors:- Dr. M. Pushpalatha (Prof. and HOD), Raya Bandyopadhyay

Abstract- Our aim in this project is to identify Key Performance Indexes (KPIs) that can define a student's level of comprehension after studying certain classes, and to show how those KPIs can be applied in classification, used to calculate the success rate of skills in universities, or even used to place specific students in areas where they can succeed. We also analyse which data mining algorithm gives the highest accuracy on our data and address some of the open problems we may encounter along the way, based on the existing research literature. Understanding the learner's in-depth thinking process after a lesson or series of lessons gives more information about where a student is lacking, or whether certain skills are lacking across most students in a common pattern. This enables more fluid methods for classifying students and academics into a system: we can categorize them based on class performance or regular assignments, build a system that gives an understanding of a particular student's grasp of a certain subject, and eventually place the groups of students performing well in certain subjects into opportunities which enhance their skill sets and help them pick a customized career. Multi-phase analysis and cluster analysis are intended to be applied to the data, from which the determining KPIs will be identified at the end. Based on these KPIs, we can access important information and, if possible, present it on a working dashboard.

A Review on Plant Disease Detection Techniques
Authors:- Samvedya Jedhe Deshmukh, Sahil Kachole, Nishant Parakh, Umesh Chaudhari

Abstract- Agricultural production is one of the most important sources of revenue for the economy, and agriculture is regarded as one of the most important pillars of the Indian economy. It contributes significantly to the country’s GDP and provides employment to a huge number of individuals in the farming industry. Plant disease disrupts normal plant growth and is one of the leading causes of lower production, which leads to economic losses. Early detection of disease aids the development of treatments that can slow its spread in plants, which is one of the reasons plant leaf disease detection is so crucial in agriculture. Leaf inspection is regarded as one of the most effective methods for diagnosing plant diseases, and computer vision and machine learning techniques are extremely beneficial for identifying and comprehending data from digital photographs. The main focus of the research is on different algorithms for detecting plant disease. This research will assist researchers and students in selecting the optimum algorithm based on previous research.

Survey on Medical Image Diagnosis Techniques and Features
Authors:- Samvedya Jedhe Deshmukh, Sahil Kachole, Nishant Parakh, Umesh Chaudhari

Abstract- The ability to acquire images from inside a patient has revolutionized the way doctors diagnose and treat diseases, with almost all clinical pipelines now involving imaging to some degree. The development of these imaging methods has led to the field of medical image computing, where a multitude of tools and techniques have been proposed to aid clinicians and researchers in interpreting and analyzing these images. Most researchers have proposed techniques for a single disease type, such as tumor, COVID, malaria, or heart attack. This paper summarizes some of the image features used for medical report diagnosis. Techniques proposed by various scholars are also detailed in the paper, various types of medical imaging techniques are briefly surveyed, and the evaluation parameters used for comparison of methods are also mentioned.

Integrating Risk Management: Technology’s Role in Bridging FinTech and Traditional Banks
Authors:- Chintamani Bagwe

Abstract- This article provides an overview of risk management strategies in the financial industry landscape, comparing the risk management practices of FinTech companies and traditional banks while taking into account a regulatory landscape that embraces technological change. Risk management has always been central to the financial sector, ensuring the stability and safety of all stakeholders involved. FinTech companies, which deliver most traditional banking services through technology, introduce unique offerings that come with new risk types such as cybersecurity and regulatory risk, requiring widely different mitigation measures. Their responses to these risks include automation, flexible ecosystem formation, and creating customer-focused entities. Traditional banks, in contrast, have developed time-proven risk management schemas that span enterprise risk management, regulatory compliance, and risk aversion. However, they face their own complexity in the immense burden of maintaining compliance and integrating existing technology with emerging tech; collaboration with FinTech offers a way out through cross-role processes. Furthermore, the regulatory space will also see changing regulations, to which FinTech firms and traditional banks will have to adapt to continue operations. In the future, existing and new technologies will be integrated into the space, resulting in better risk management schemas. Examples of successful partnerships between FinTech companies and traditional banks are shared at the end. The collaboration between FinTech companies and traditional banks will be essential to survive the current revolution.

DOI: 10.61137/ijsret.vol.10.issue5.793

Data Warehouse Modernization for Insurance: Integrating AI and Cloud Technologies
Authors:- Srinivasa Chakravarthy Seethala

Abstract- The insurance sector faces mounting challenges from regulatory changes, competitive pressures, and the demand for real-time data insights. Traditional data warehouses, essential for data storage and retrieval, often lack the flexibility, speed, and scalability required by modern insurance operations. This article examines how integrating Artificial Intelligence (AI) and cloud technologies can drive data warehouse modernization for insurers, delivering real-time decision-making capabilities, optimized data management, and enhanced operational efficiency. We explore methodologies, technologies, and case studies that demonstrate the transformative impact of AI and cloud in modernizing legacy data warehouses in the insurance sector.

DOI: 10.61137/ijsret.vol.10.issue5.794

Future of Supply Chains: Trends in Automation, Globalization, and Sustainability
Authors:- Lakshmi Kalyani Chinthala

Abstract- The future of supply chains is being reshaped by a combination of technological advancements, globalization, and an increasing focus on sustainability. This paper examines the current trends that are influencing supply chain management and how businesses are adapting to these changes to remain competitive. Automation is playing a pivotal role in enhancing the efficiency and flexibility of supply chains, reducing costs, and improving accuracy. The integration of technologies such as artificial intelligence (AI), robotics, and the Internet of Things (IoT) is transforming traditional supply chain models, enabling real-time tracking, predictive analytics, and autonomous operations. At the same time, globalization has led to more complex supply chains, with companies sourcing materials and products from across the globe. While this has resulted in cost efficiencies, it has also introduced new challenges related to supply chain visibility, risk management, and geopolitical factors. Sustainability is another key trend, as businesses are under increasing pressure to reduce their environmental impact, adopt responsible sourcing practices, and promote ethical labor standards. The paper explores how companies are navigating these trends and highlights the importance of integrating automation, globalization, and sustainability into supply chain strategies to build resilience and achieve long-term success.

DOI: 10.61137/ijsret.vol.10.issue5.795

Disaster Recovery Planning For Hybrid Solaris And Linux Infrastructures

Authors: Sambasiva Rao Madamanchi

Abstract: Disaster recovery (DR) planning for hybrid infrastructures that combine Solaris and Linux poses unique challenges due to differences in tooling, system architecture, and operational practices. Solaris often supports legacy, mission-critical applications, while Linux drives modern, scalable workloads. This document provides a comprehensive guide to building a resilient DR strategy across both platforms. Key areas include risk assessment, backup and recovery tooling, system state preservation, and application/database restoration. Emphasis is placed on automation through Ansible, shell, and Python scripts, as well as configuration management and monitoring integration. The guide also highlights best practices such as maintaining consistent time and user IDs, isolating recovery zones, and leveraging enterprise backup solutions. Through clear documentation, defined team roles, and routine testing, organizations can achieve a DR framework that is platform-aware, repeatable, and aligned with evolving operational and compliance requirements.

DOI: https://doi.org/10.5281/zenodo.15771601
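The platform-aware automation the abstract emphasizes can be sketched as a small dispatcher that chooses backup tooling per operating system. This is a hedged sketch only: the tool choices (a ZFS snapshot on Solaris, tar on Linux) follow common practice, and the paths, dataset names, and labels are illustrative, not taken from the paper.

```python
import platform

def backup_command(os_name, dataset_or_path, label):
    """Build a platform-appropriate backup command for a DR runbook.

    Commands are assembled, not executed, so the sketch is safe to run.
    """
    if os_name == "SunOS":             # Solaris reports itself as SunOS
        return ["zfs", "snapshot", f"{dataset_or_path}@{label}"]
    if os_name == "Linux":
        return ["tar", "-czf", f"/backup/{label}.tar.gz", dataset_or_path]
    raise ValueError(f"unsupported platform: {os_name}")

# Fall back to the Linux branch when run on a non-Unix test machine.
detected = platform.system() if platform.system() in ("SunOS", "Linux") else "Linux"
cmd = backup_command(detected, "/etc", "dr-test")
```

A real runbook would pass such command lists to an executor (Ansible, `subprocess`, or a job scheduler) and record results for the routine DR testing the abstract recommends.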

 

Predictive Maintenance Modeling in Solaris and Red Hat Platforms

Authors: Albert Joshep

Abstract: Predictive maintenance is an emerging discipline that combines system telemetry, machine learning, and automation to preemptively identify and resolve failures in complex computing environments. This review explores the implementation of predictive maintenance in Solaris and Red Hat Enterprise Linux (RHEL) platforms, two prominent Unix-based systems widely deployed across enterprise IT landscapes. By comparing architectural features, telemetry sources, and modeling techniques, the study highlights both the unique capabilities and challenges presented by each operating system. Solaris benefits from a robust fault management architecture (FMA), advanced diagnostics like DTrace, and SPARC hardware optimization, making it well-suited for hardware-level monitoring. Red Hat, on the other hand, excels in automation, scalability, and hybrid cloud compatibility through tools such as Red Hat Insights, Ansible, and Performance Co-Pilot. The article delves into key predictive modeling strategies, including time-series forecasting, anomaly detection, and classification, utilizing methods ranging from ARIMA and Isolation Forests to neural networks. Integration and automation workflows are examined, showcasing how Unix-native tools and open-source frameworks are used to train, deploy, and act upon model predictions. Through case studies, the review quantifies the benefits of predictive maintenance, including reduced mean time to recovery (MTTR), enhanced SLA adherence, and cost savings. Finally, it discusses limitations such as data inconsistency, model drift, and cross-platform transferability, while outlining future directions, including AI co-pilots, self-learning systems, and Predictive Maintenance-as-a-Service (PMaaS). By offering a detailed comparative analysis and strategic recommendations, this review serves as a practical guide for enterprises aiming to implement or enhance predictive maintenance in mixed Unix environments.

DOI: https://doi.org/10.5281/zenodo.15798515
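The anomaly-detection strategy surveyed above can be illustrated with a rolling z-score detector over telemetry, used here as a deliberately lightweight stand-in for the ARIMA and Isolation Forest models the review discusses. The temperature readings, window size, and threshold are invented for illustration.

```python
import statistics

def anomalies(readings, window=5, threshold=3.0):
    """Flag indices whose value deviates > threshold standard deviations
    from the mean of the preceding `window` readings."""
    flagged = []
    for i in range(window, len(readings)):
        hist = readings[i - window:i]
        mu, sigma = statistics.mean(hist), statistics.pstdev(hist)
        if sigma and abs(readings[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

# Invented CPU-temperature telemetry; index 7 is an obvious spike.
temps = [61, 62, 61, 63, 62, 61, 62, 95, 62, 61]
```

In a production pipeline, flagged indices would feed the automation layer (e.g. an Ansible remediation playbook) rather than just being reported.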

The Use Of Scalable Disaster Recovery Architectures For Hybrid UNIX Systems

Authors: Hamid Ansari

Abstract: In today’s digital landscape, enterprise IT environments demand resilient and scalable disaster recovery (DR) solutions, especially in hybrid UNIX systems where Solaris, AIX, HP-UX, and Linux coexist. These systems often run critical workloads in sectors like finance, healthcare, telecommunications, and government, necessitating DR architectures that ensure high availability, data integrity, and business continuity across heterogeneous platforms. This review provides a comprehensive analysis of scalable DR architectures tailored for hybrid UNIX environments, addressing the complex interplay between storage replication, backup strategies, orchestration tools, and operating system-level recovery mechanisms. Key architectural patterns such as active-active and multi-site replication models are examined alongside file system-level and block-level replication technologies including ZFS send/receive, Veritas Volume Replicator, and SAN mirroring solutions. The paper compares OS-specific recovery tools like Ignite-UX, mksysb, and Solaris Unified Archives, and assesses their interoperability in multi-vendor environments. Further, the study explores the orchestration layer of disaster recovery, highlighting the role of configuration management and automation tools like Ansible, Puppet, and scripting frameworks. Monitoring, testing, and policy-driven recovery are addressed as essential pillars of a sustainable DR strategy. Real-world case studies are analyzed to illustrate practical implementations, performance outcomes, and lessons learned in deploying scalable DR across diverse UNIX infrastructures. Challenges such as format incompatibility, network reconfiguration, and security hardening are critically discussed. Finally, the review anticipates emerging trends, including the use of AI/ML for proactive fault prediction and the integration of DR into continuous compliance and observability pipelines. 
This article serves as a reference for system architects, disaster recovery planners, and enterprise IT professionals seeking to build resilient, automated, and cross-platform DR frameworks for UNIX-centric infrastructures.

DOI: http://doi.org/10.5281/zenodo.15798539
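The ZFS send/receive replication the review examines can be sketched as an assembled pipeline. The host name, dataset, and snapshot labels below are illustrative assumptions; the `-i` (incremental) and `-F` (force rollback on receive) flags are standard ZFS options, and the command string is built rather than executed.

```python
def zfs_replicate(dataset, prev_snap, new_snap, dr_host):
    """Build an incremental ZFS send/receive pipeline for DR replication.

    Sends only the delta between prev_snap and new_snap to the DR host.
    """
    send = f"zfs send -i {dataset}@{prev_snap} {dataset}@{new_snap}"
    recv = f"ssh {dr_host} zfs receive -F {dataset}"
    return f"{send} | {recv}"

cmd = zfs_replicate("tank/apps", "mon", "tue", "dr-site01")
```

An orchestration tool (Ansible, cron, or a DR runbook script) would schedule this per dataset and verify the snapshot chain on the DR side after each run.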

The Recent Automating System Patching Via Satellite And Puppet Integration

Authors: Usha Rani

Abstract: In today’s dynamic enterprise IT landscape, system patching is a critical operation that ensures security, compliance, and performance. Manual patching processes are often fraught with delays, configuration drift, and inconsistencies, leading to potential security breaches and downtime. Automating this process using integrated tools like Red Hat Satellite and Puppet significantly enhances lifecycle management by aligning system states with organizational policies. Red Hat Satellite offers a centralized platform for managing Linux content, lifecycle environments, and host registration, while Puppet provides robust configuration management capabilities for enforcing desired system states. Together, they enable enterprises to deploy, audit, and maintain patches consistently across vast infrastructure landscapes. This review explores the symbiotic relationship between Satellite and Puppet, focusing on how their integration delivers operational efficiency and compliance. It discusses the underlying architecture of each tool, the mechanics of their integration, and the workflow that governs automated patching. The study highlights key functionalities such as content views, CVE mapping, node classification, and patch window orchestration. Additionally, the review presents real-world case studies from financial services, healthcare, and telecom sectors that have adopted this integration for scalable and secure patch management. The article also identifies challenges in implementation, including integration complexity, legacy system compatibility, and potential risks from misclassification or dependency conflicts. Future trends are examined, including the use of AI/ML for predictive patching, ChatOps for collaborative operations, and declarative frameworks for Patch as Code strategies.
In conclusion, the integrated use of Satellite and Puppet forms a cornerstone for secure, compliant, and cost-effective system maintenance, empowering IT organizations to proactively manage vulnerabilities while reducing operational overhead.

DOI: https://doi.org/10.5281/zenodo.15798673
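The CVE-mapping step described above can be sketched as matching hosts against applicable errata before a patch window. The advisory IDs, package names, versions, and hosts below are invented for illustration, and the lexicographic version comparison is a deliberate simplification of real RPM version ordering.

```python
# Invented errata: advisory -> fixed package versions.
advisories = {"RHSA-2024:0001": {"openssl": "3.0.7-18"},
              "RHSA-2024:0002": {"kernel": "5.14.0-362"}}

# Invented host inventory: host -> installed package versions.
installed = {"web01": {"openssl": "3.0.7-16", "kernel": "5.14.0-362"},
             "db01":  {"openssl": "3.0.7-18", "kernel": "5.14.0-300"}}

def pending_errata(host):
    """Advisories whose fixed version is newer than what the host runs.

    NOTE: plain string '<' stands in for proper RPM version comparison.
    """
    return sorted(adv for adv, pkgs in advisories.items()
                  for pkg, ver in pkgs.items()
                  if installed[host].get(pkg, "") < ver)
```

In a Satellite/Puppet workflow, this mapping is produced by Satellite's content views and errata data, and the resulting host list drives node classification and patch-window scheduling.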

 

The Recent Concept Of LDOM And GDOM Automation Strategies In Oracle Solaris

Authors: Dhanush Aradhya

Abstract: In enterprise IT environments, efficient server virtualization and domain management are crucial to optimizing hardware utilization, operational agility, and high availability. Oracle Solaris, a flagship Unix operating system renowned for its scalability and security, introduces virtualization constructs such as Logical Domains (LDOMs) and Guest Domains (GDOMs) through its Oracle VM Server for SPARC architecture. These constructs enable fine-grained partitioning of system resources on SPARC hardware, allowing multiple independent OS instances to coexist on a single physical server. However, as the number of domains increases in enterprise deployments, manual provisioning and management become unsustainable. This has led to a growing need for robust automation strategies that can orchestrate domain lifecycle operations with consistency, speed, and minimal administrative overhead. This review article comprehensively examines the architectural principles, automation tools, and orchestration strategies used to manage LDOMs and GDOMs in Oracle Solaris environments. It begins with a detailed explanation of the virtualization framework in Solaris, followed by an exploration of domain architecture and the challenges posed by manual administration. Native tools such as ldm, SMF (Service Management Facility), FMA (Fault Management Architecture), and ZFS are discussed alongside automation methods using Bash and Python scripting. Further, the article evaluates how Oracle tools like Oracle Enterprise Manager and external platforms like Ansible are used to automate provisioning, monitoring, backup, and fault handling for LDOM and GDOM configurations. Real-world case studies illustrate the implementation of these strategies in telecom and financial sectors, highlighting time savings, improved uptime, and reduced human error. The article also discusses the challenges faced during automation, including compatibility issues, security risks, and integration bottlenecks.
Looking ahead, it explores the future of AI-driven domain orchestration, RESTful automation interfaces, and hybrid cloud integration. This review provides a strategic and technical foundation for IT architects, system administrators, and automation engineers aiming to optimize their Solaris virtualization environments through effective LDOM and GDOM automation strategies.

DOI: http://doi.org/10.5281/zenodo.15798687
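A first step in the scripted automation described above is inventorying domains by parsing the machine-readable output of `ldm list -p`. This is a hedged sketch: the sample output below only approximates the real parseable format and is not taken from a live system.

```python
# Approximation of `ldm list -p` machine-readable output (illustrative only).
sample = """\
DOMAIN|name=primary|state=active|ncpu=8|mem=16G
DOMAIN|name=gdom1|state=active|ncpu=4|mem=8G
DOMAIN|name=gdom2|state=bound|ncpu=4|mem=8G"""

def parse_ldm(output):
    """Turn pipe-delimited DOMAIN records into a list of field dicts."""
    domains = []
    for line in output.splitlines():
        if line.startswith("DOMAIN|"):
            fields = dict(f.split("=", 1) for f in line.split("|")[1:])
            domains.append(fields)
    return domains

# Domains that may need a lifecycle action (e.g. start) before maintenance.
inactive = [d["name"] for d in parse_ldm(sample) if d["state"] != "active"]
```

An automation script would obtain the real output via `subprocess.run(["ldm", "list", "-p"], ...)` on the control domain, then feed the parsed inventory into provisioning or monitoring workflows.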

 

 

Linux & Unix System Administration AI-Augmented Troubleshooting In Multi-OS Unix Environments

Authors: Ganapathi Basu

Abstract: The increasing operational complexity of multi-OS Unix environments, comprising legacy and modern systems such as Solaris, AIX, HP-UX, Linux, and BSD, poses significant challenges for traditional system troubleshooting methodologies. These environments demand high availability, rapid diagnostics, and platform-agnostic observability, which are difficult to achieve using manual scripting and OS-specific tools alone. This review examines how Artificial Intelligence (AI) augments system administration by enabling intelligent diagnostics, predictive monitoring, and automated remediation across heterogeneous Unix infrastructures. Beginning with an overview of Unix's architectural evolution and the interoperability challenges in multi-OS deployments, the article outlines the limitations of conventional troubleshooting practices, including shell-based diagnostics, tribal knowledge, and siloed toolsets. It then explores the application of AI techniques such as machine learning for anomaly detection, natural language processing for log interpretation, and reinforcement learning for adaptive, self-healing responses. AI enables powerful capabilities in log normalization, root cause analysis (RCA), and event correlation, especially critical in reducing alert fatigue and accelerating fault isolation. Advanced use cases such as predictive failure detection, behavior modeling, and AI-enhanced capacity planning illustrate the potential of intelligent monitoring. The review further evaluates unified diagnostic platforms like Splunk and Dynatrace, cross-platform frameworks, and real-world AI deployments in multi-OS settings. Key deployment challenges such as data silos, model generalization, and explainability are addressed alongside recommendations for integration with ITSM and DevSecOps pipelines.
Emerging trends, including AI co-pilots for system administrators, AIOps automation, and observability-as-a-service, reflect a future where AI transforms Unix operations from reactive maintenance to autonomous infrastructure resilience. The paper concludes by emphasizing the importance of augmented intelligence, where human expertise is amplified, not replaced, offering a practical roadmap for AI-driven modernization in Unix ecosystems.

DOI: http://doi.org/

 

 

Cloud Migration Strategies For Hybrid Enterprises: Lessons From AWS And GCP Infrastructure Transitions

Authors: Harish Govinda Gowda

Abstract: Hybrid enterprises are increasingly adopting cloud computing to modernize legacy systems, enhance scalability, and improve operational agility. However, transitioning to platforms like AWS and GCP involves more than simply shifting workloads—it requires strategic planning, robust security practices, and effective operational models that can support both on-premise and cloud-native systems. This article explores a comprehensive framework for successful cloud migration within hybrid environments, drawing from real-world case studies and best practices. Topics covered include cloud readiness assessment, phased workload migration strategies, network integration patterns, identity and access management, security and compliance alignment, and cloud-native operations. We also examine the unique hybrid capabilities offered by AWS and GCP, including Direct Connect, Interconnect, Anthos, and Outposts, and how these can be leveraged for seamless application continuity. Real enterprise case studies highlight key lessons learned, such as the importance of governance through Cloud Centers of Excellence, the role of infrastructure as code, and the value of unified observability.

DOI: https://doi.org/10.5281/zenodo.15916724

 

Multi-Fact Table Modeling in Power BI: Enhancing Analytical Depth in Complex Pharma Dashboards

Authors: Ajay Kumar Kota

Abstract: In the pharmaceutical industry, data complexity and fragmentation pose significant challenges to delivering unified, actionable insights. Traditional single-fact table models in Power BI often fall short when integrating data from multiple domains such as sales, prescriptions, marketing, and clinical operations. This article explores the strategy and implementation of multi-fact table modeling in Power BI, a powerful approach to unify diverse datasets while preserving analytical integrity and granularity. We examine the architecture of multi-fact models, the challenges of schema design and granularity alignment, and the importance of shared dimensions and bridge tables. The article also delves into advanced DAX techniques for managing filter contexts, optimizing performance, and delivering accurate KPIs across disparate data sources. A real-world pharmaceutical case study is included to illustrate practical applications and business impact.

DOI: https://doi.org/10.5281/zenodo.16024582
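The multi-fact layout described above, two fact tables joined through a shared dimension, can be made concrete with a toy example. Power BI expresses these relationships declaratively in its model and in DAX; plain Python is used here only to make the join explicit, and all product names and figures are invented.

```python
# Shared product dimension and two fact tables at different granularities.
dim_product = {"P1": "CardioMax", "P2": "NeuroCalm"}   # invented products
fact_sales = [("P1", 1200.0), ("P2", 800.0), ("P1", 300.0)]  # (product, revenue)
fact_rx    = [("P1", 45), ("P1", 30), ("P2", 10)]            # (product, scripts)

def kpi_by_product():
    """Aggregate each fact table independently, then align on the shared
    dimension, the same filter-propagation pattern a multi-fact Power BI
    model uses."""
    out = {}
    for pid, name in dim_product.items():
        revenue = sum(v for p, v in fact_sales if p == pid)
        scripts = sum(n for p, n in fact_rx if p == pid)
        out[name] = {"revenue": revenue, "rx_count": scripts}
    return out
```

The key property, mirrored from the article's schema guidance, is that each fact table is aggregated at its own grain and only the shared dimension relates them, avoiding fan-out from joining facts to each other directly.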

Integrating AI Workflows with Health Informatics Pipelines: Opportunities and Challenges

Authors: Guram Shalvovich Danelia, Nino Giorgievna Kalandadze, Levan Besarionovich Mchedlidze, Salome Iraklievna Tsereteli

Abstract: The convergence of artificial intelligence (AI) and health informatics has the potential to revolutionize clinical decision-making, disease surveillance, and personalized medicine. This study explores the integration of AI workflows with existing health informatics pipelines, examining both the transformative opportunities and the critical challenges associated with such integration. By analyzing case studies from electronic health record (EHR) systems, bioinformatics pipelines, and radiological imaging networks, we identify architectural patterns that enable seamless AI integration. Additionally, the research addresses the barriers posed by data heterogeneity, workflow fragmentation, regulatory compliance, and algorithm interpretability. The findings suggest that while AI offers immense benefits in improving healthcare outcomes and operational efficiency, a strategic, interoperable, and ethically grounded approach is necessary for scalable implementation in health informatics infrastructures.

DOI: https://doi.org/10.5281/zenodo.16315871

Edge-AI and Myobioscan Devices: Towards Real-Time Clinical Insights

Authors: Oleh Mykhailovych Hrytsenko, Iryna Volodymyrivna Lysenko, Denys Ivanovych Sydorenko, Viktoriia Andriivna Kravets

Abstract: The integration of Edge-AI with wearable biomedical devices like Myobioscan is reshaping the landscape of real-time clinical diagnostics and patient monitoring. This paper explores how embedding artificial intelligence at the device edge enables low-latency, high-frequency processing of biosignals such as electromyography (EMG), electrocardiography (ECG), and motion patterns. The combination of Myobioscan’s compact sensor technology with on-device AI accelerators facilitates proactive health assessments, early anomaly detection, and decentralized clinical interventions. By reviewing recent deployments and experimental models, this study identifies key performance metrics, data handling architectures, and regulatory considerations in deploying Edge-AI for mobile health. The findings point toward a scalable and responsive healthcare ecosystem driven by distributed intelligence at the physiological interface.

DOI: https://doi.org/10.5281/zenodo.16352134

Machine Learning Models on LDOM-Enhanced Biomedical Server Environments

Authors: Zamira Sadullaevna Rajabova, Otabek Abduvohidovich Madrahimov, Dilshod Jamolovich Saidov, Malika Rasulovna Kadirova

Abstract: The evolution of biomedical data analytics has been closely tied to the scalability and reliability of server infrastructure. Logical Domains (LDOMs), a virtualization technology native to Oracle Solaris, offer hardware-level isolation and performance efficiency that align well with the computational demands of machine learning (ML) in biomedical applications. This research investigates the deployment, optimization, and execution of various ML models within LDOM-enhanced server environments specifically tailored for high-throughput biomedical workloads. It evaluates the architectural benefits, virtualization overhead, and performance stability when applying ML algorithms for genomics, diagnostics, and health informatics. The findings suggest that LDOM-based infrastructures not only support secure multitenancy for ML pipelines but also enable tunable resource allocation strategies for precision performance in real-time medical contexts.

DOI: https://doi.org/10.5281/zenodo.16352646

Unified Architecture for Genomic Data Analytics in Hybrid Cloud Systems

Authors: Artur Eduardovich Karapetyan, Lusine Rafikovna Minasyan, Hovhannes Grigorievich Manukyan, Ani Serobovna Avetisyan, Vardan Levonovich Sahakyan

Abstract: The exponential growth of genomic data presents immense challenges in terms of storage, processing, and analytics. Hybrid cloud systems—combining on-premises resources with scalable cloud services—offer a compelling solution for addressing these computational demands. This paper presents a unified architectural model designed to optimize genomic data analytics in hybrid cloud environments. By integrating containerized bioinformatics workflows, secure data orchestration mechanisms, and AI-driven scheduling, the proposed framework ensures agility, scalability, and compliance. We explore the role of cloud bursting for peak genomic analysis workloads, address data residency and regulatory concerns, and demonstrate performance improvements across typical use cases such as variant calling and gene expression analysis. This architecture supports real-time analytics, secure collaboration, and cross-institutional data sharing in the genomics domain.

DOI: https://doi.org/10.5281/zenodo.16354292

Nanoparticle-Induced Stress In Environmental Microbiomes: Ecotoxicological Perspectives

Authors: Basant Kumar Sahu, Lata Pradhan

Abstract: The increasing use of engineered nanoparticles (NPs) across consumer products, medicine, and industrial applications has led to their unintended release into natural ecosystems, sparking ecotoxicological concerns. Due to their small size, high surface area, and reactivity, nanoparticles interact uniquely with microorganisms in soil, water, and sediment ecosystems. These environmental microbiomes—complex networks of bacteria, archaea, fungi, and protozoa—play essential roles in nutrient cycling, decomposition, and pollutant degradation. However, exposure to nanoparticles often results in oxidative stress, disruption of cellular membranes, genotoxicity, and changes in metabolic functions. Such stress responses can reduce microbial diversity, impair ecosystem processes, and destabilize trophic networks. Despite these critical risks, traditional environmental risk assessments fail to incorporate microbial endpoints, focusing instead on higher organisms. This review explores the pathways through which nanoparticles induce stress in microbiomes, the ecological consequences of such interactions, and the current limitations in detection and regulation. Emphasis is placed on using omics tools and community-level bioindicators to assess sub-lethal effects. Addressing nanoparticle impacts at the microbial level is vital for maintaining ecological balance and sustainability. The paper concludes by recommending policy frameworks and green nanotechnologies that prioritize microbiome integrity in environmental safety assessments.

DOI: https://doi.org/10.5281/zenodo.16835383

 

Nanoscale Microbial Interactions In Soil-Water Systems: A New Paradigm

Authors: Arun Kumar Patidar, Bhavana Chauhan

Abstract: Microbial life in soil-water systems operates at a scale far more intricate than previously understood. With the emergence of nanoscale imaging and molecular tools, researchers have begun to uncover a new paradigm in microbial ecology—one where microbial interactions, community behavior, and environmental feedbacks occur at the nanometer level. These interactions encompass molecular exchanges, quorum sensing, and nanostructure-based adhesion mechanisms that shape the functionality and resilience of soil ecosystems. At these scales, microbial dynamics dictate nutrient flux, pollutant transformation, and plant-microbe symbiosis in ways not observable through conventional microbiological techniques. This article provides a comprehensive exploration of these nanoscale phenomena, examining how environmental pressures and nanoscale physical forces drive microbial behavior. The implications for sustainable land use, biogeochemical cycling, and soil rehabilitation are profound, as understanding microbial processes at this resolution can lead to breakthroughs in bioremediation, precision agriculture, and climate-resilient farming. The review also presents advances in methodologies such as atomic force microscopy, nanoSIMS, and cryo-electron tomography that have facilitated the visualization and quantification of microbial interactions at the nanoscale. Overall, this paradigm shift emphasizes the importance of considering nanoscale microbial interactions as fundamental units in soil-water system functioning.

DOI: https://doi.org/10.5281/zenodo.16835268

 

Comparative Genomics Of Microbial Populations In Agroecosystems

Authors: Harish Kumar Rathore, Monika Gupta

Abstract: The microbial communities inhabiting agroecosystems play critical roles in soil health, nutrient cycling, and crop productivity. With advancements in high-throughput sequencing technologies, comparative genomics has emerged as a powerful tool to analyze the diversity and functional capabilities of these microbial populations. This study explores how comparative genomics can illuminate the evolutionary relationships, functional gene repertoire, and adaptive traits among microbial taxa in various agricultural environments. By analyzing metagenomic datasets from different soil types and farming practices, we identify patterns of gene distribution related to nitrogen fixation, phosphorus solubilization, and pathogen resistance. The study also examines how horizontal gene transfer contributes to microbial resilience in disturbed agroecosystems. Insights from comparative genomic studies enhance our understanding of the impact of agricultural practices—such as crop rotation, fertilization, and pesticide use—on microbial diversity and ecosystem function. Case studies from organic and conventional farms reveal significant differences in microbial gene expression and evolutionary adaptation. This article underscores the importance of integrating genomic data into sustainable agriculture strategies and offers future directions for using microbial genomics in crop management and soil restoration efforts.

DOI:

 

 

Microbial Biosensors: Genetic Tools For Monitoring Soil Health

Authors: Pradeep Kumar Netam, Meena Porte

Abstract: Soil health is an integral determinant of agricultural productivity, ecosystem balance, and environmental sustainability. Microbial biosensors, leveraging genetically engineered microbial strains, offer a novel approach to real-time, in situ monitoring of soil contaminants and nutrient dynamics. These biosensors are designed to detect specific chemical signals—ranging from heavy metals and pesticides to changes in pH and nitrogen content—by producing measurable outputs such as fluorescence, bioluminescence, or electrochemical signals. This article reviews the development and deployment of microbial biosensors as tools for assessing soil health. It explores their underlying biological principles, integration into environmental monitoring frameworks, and potential to overcome the limitations of conventional soil assessment techniques. The paper emphasizes the importance of synthetic biology and CRISPR-based modulation in enhancing biosensor specificity and stability. Furthermore, it highlights successful case studies from agriculture, bioremediation, and land reclamation projects. Finally, the article discusses current challenges—such as environmental variability and regulatory hurdles—and future directions, including field-deployable biosensor platforms and wireless data integration. The findings underscore microbial biosensors’ transformative potential in advancing precision agriculture and soil restoration practices through continuous and targeted ecological surveillance.

DOI: https://doi.org/10.5281/zenodo.16834975

Nano-Enabled Microbial Bioreactors for Sustainable Water Purification

Authors: Ajay Kumar Dash, Ipsita Pradhan

Abstract: Nano-enabled microbial bioreactors are emerging as an innovative approach for sustainable water purification, combining the catalytic versatility of microbes with the high surface area, reactivity, and functional properties of nanomaterials. These hybrid systems are designed to enhance the degradation, adsorption, and transformation of organic pollutants, heavy metals, and pathogens in contaminated water sources. Nanoparticles act as catalysts, redox mediators, or structural supports, accelerating microbial metabolic processes and facilitating electron transfer in bioreactors. This synergistic relationship significantly improves pollutant removal efficiency, reduces treatment time, and enhances system stability. As global freshwater resources face escalating pollution and scarcity, nano-enabled microbial bioreactors offer a scalable and eco-friendly solution that bridges the gap between advanced nanotechnology and traditional biological wastewater treatment. This article explores their working principles, applications, environmental benefits, and future prospects.

DOI: https://doi.org/10.5281/zenodo.16842310

Integrating Nanobio Interfaces For Real-Time Environmental Monitoring

Authors: Manoj Kumar Pradhan, Anjali Swain

Abstract: The integration of nanotechnology and biological sensing elements has paved the way for advanced nanobio interfaces that can revolutionize environmental monitoring. These hybrid systems offer real-time, highly sensitive detection capabilities for a wide range of environmental pollutants, including heavy metals, organic contaminants, and microbial toxins. By combining the specificity of biological recognition elements with the signal-enhancing properties of nanomaterials, nanobio interfaces provide a dynamic solution for continuous and in-situ environmental diagnostics. This paper explores the design, mechanisms, applications, and challenges associated with deploying nanobio interfaces in environmental settings, and outlines their potential as scalable, adaptable, and cost-effective monitoring tools.

DOI: https://doi.org/10.5281/zenodo.16842578

Functional Diversity Of Microbial Enzymes In Acidic Mine Drainage Sites

Authors: Akhilesh Kumar Mandal, Savita Patra

Abstract: Acidic mine drainage (AMD) environments are characterized by extreme acidity and elevated concentrations of heavy metals, making them inhospitable to most life forms. Yet, microbial life thrives in these ecosystems through unique metabolic adaptations, particularly enzyme systems that function under such harsh conditions. This study explores the functional diversity of microbial enzymes in AMD sites, with a focus on their ecological roles, biogeochemical contributions, and potential applications in bioremediation. Through metagenomic analyses, microbial communities are examined for genes encoding enzymes involved in sulfur and iron oxidation, heavy metal resistance, and acid tolerance. The findings reveal a complex microbial network dominated by acidophilic chemolithoautotrophs such as Acidithiobacillus ferrooxidans and Leptospirillum ferrooxidans, which orchestrate critical oxidation-reduction processes. The presence of specialized enzymes like rusticyanin, cytochrome c oxidase, and ATPases adapted for low pH indicates functional specialization. Furthermore, these enzymes facilitate biogeochemical cycling and influence AMD chemistry, contributing to both environmental degradation and potential restoration when harnessed correctly. This study underscores the value of microbial enzyme diversity in understanding AMD ecology and leveraging it for sustainable environmental cleanup strategies.

DOI: http://doi.org/10.5281/zenodo.16870814

Metatranscriptomic Profiling Of Microbial Stress Responses To Soil Contaminants

Authors: Lalit Kumar Sen, Madhvi Chourasiya

Abstract: Soil contamination by heavy metals, pesticides, hydrocarbons, and industrial pollutants disrupts microbial ecology, affecting soil health and plant productivity. Metatranscriptomics, the large-scale sequencing of environmental RNA, offers an advanced approach to decipher real-time microbial responses to such stressors. This study investigates the functional gene expression profiles of soil microbiomes under contaminant stress using metatranscriptomic analysis. By examining transcripts linked to oxidative stress, metal resistance, and pollutant degradation, we identify key microbial pathways that mediate adaptation and survival. Our findings highlight the upregulation of genes involved in efflux pumps, antioxidative enzymes like catalases and peroxidases, and biodegradative enzymes including monooxygenases and dioxygenases. Community-level expression patterns reveal taxonomic shifts favoring resilient genera such as Pseudomonas, Acinetobacter, and Rhodococcus. The data suggest that contaminated environments exert strong selective pressures, driving microbial communities toward functional redundancy and niche specialization. This study underscores the potential of metatranscriptomics as a tool to monitor ecological risk, assess bioremediation capacity, and develop precision strategies for soil restoration. Our work provides a foundational framework for future research aiming to optimize microbial functions for environmental detoxification and sustainable land management.

DOI: http://doi.org/10.5281/zenodo.16870960

CRISPR Applications In Studying Microbial Resistance In Contaminated Ecosystems

Authors: Raghavendra Kumar, Smita Tiwari

Abstract: Microbial communities inhabiting contaminated ecosystems often develop complex resistance mechanisms to survive toxic environmental stressors. Understanding the molecular basis of this resistance is essential for ecological risk assessment and the development of bioremediation strategies. CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) technology, originally discovered as an adaptive immune system in bacteria and archaea, has emerged as a transformative tool for functional genomics and microbial ecology. This study explores how CRISPR-based approaches can elucidate microbial resistance mechanisms in polluted habitats, including heavy metal-rich soils, industrial effluents, and pesticide-contaminated farmlands. Using CRISPR interference (CRISPRi) and activation (CRISPRa), researchers can selectively knock down or upregulate microbial genes linked to metal ion transport, oxidative stress response, and efflux pump regulation. Metagenome-assembled genomes (MAGs) in tandem with CRISPR screens provide a robust framework to map resistance pathways at the community level. This article presents an overview of current CRISPR applications in microbial resistance research, evaluates their ecological implications, and highlights their potential to inform biotechnological interventions for ecosystem restoration. By integrating gene-editing precision with metagenomic profiling, CRISPR tools open new avenues to monitor, model, and modulate microbial responses to contamination.

DOI: http://doi.org/10.5281/zenodo.16871016

Metagenomic Analysis Of Microbial Communities In E-Waste Bioreactors

Authors: Vivek Kumar Ghosh, Kusum Singh

Abstract: Electronic waste (e-waste) bioremediation has emerged as a sustainable approach to manage the growing burden of discarded electronics. This study investigates microbial communities in e-waste bioreactors using metagenomic techniques to identify key species and functional pathways involved in metal recovery and detoxification. By deploying next-generation sequencing (NGS) and shotgun metagenomic approaches, we uncovered taxonomic diversity and biochemical functions encoded in the resident microbiota. Our results revealed a predominance of metal-resistant bacteria, including Pseudomonas, Cupriavidus, and Desulfovibrio species, which possess genes for metal reduction, transport, and biofilm formation. Functional annotation indicated the prevalence of resistance-nodulation-division (RND) transporters, metallothioneins, and oxidoreductases crucial for heavy metal sequestration. This study underscores the utility of metagenomics in unraveling complex microbial interactions and their adaptive strategies in hostile e-waste environments. Insights from this research can facilitate the engineering of microbial consortia tailored for enhanced metal recovery and minimal ecological impact. The findings also establish a foundational knowledge base for bioaugmentation practices in electronic waste treatment systems. Ultimately, the integration of omics-based techniques into environmental biotechnology can accelerate the development of efficient and eco-friendly waste valorization platforms.

DOI: http://doi.org/10.5281/zenodo.16871064

Biosurfactant-Producing Microbes In Environmental Cleaning Applications

Authors: Pankaj Kumar Yadav, Rashmi Dubey

Abstract: Biosurfactant-producing microbes have emerged as crucial agents in eco-friendly environmental remediation, particularly for cleaning up oil spills, heavy metal contaminants, and industrial pollutants. These naturally derived surface-active compounds, produced by bacteria such as Pseudomonas aeruginosa, Bacillus subtilis, and Rhodococcus erythropolis, exhibit high emulsifying activity, low toxicity, and exceptional biodegradability. The focus of this research is to evaluate how microbial biosurfactants contribute to environmental cleaning through mechanisms of emulsification, desorption, and biostimulation. Emphasis is placed on their structural diversity, metabolic pathways, and potential applications in oil spill mitigation, soil washing, and heavy metal recovery. Through a review of current studies, laboratory findings, and emerging field applications, this article investigates the comparative performance of biosurfactants against synthetic surfactants. It also explores genetic and process engineering strategies to enhance biosurfactant yields. The results point toward biosurfactant-driven bioremediation as a promising frontier for sustainable environmental management. The article concludes with future research directions, highlighting bioreactor scalability and regulatory considerations necessary for large-scale deployment. These insights underscore the transformative role of biosurfactant-producing microbes in redefining the future of green technology and environmental restoration.

DOI: http://doi.org/10.5281/zenodo.16871248

Assessing The Impact Of COVID-19 On Renewable Energy Project Finance In The US: Challenges And Opportunities

Authors: Funmilayo Fenwa

Abstract: The COVID-19 pandemic represents an unprecedented global crisis that has fundamentally altered economic landscapes across all sectors, with particular significance for renewable energy project finance in the United States. This study examines the multifaceted impacts of the pandemic on renewable energy investment patterns, policy responses, and market dynamics during 2020. Through comprehensive analysis of industry data, policy documents, and market indicators, this research reveals a complex narrative of resilience and vulnerability within the renewable energy finance sector. While overall renewable capacity additions nearly doubled in the first half of 2020, driven primarily by tax credit deadline pressures, total renewable energy investment declined by 20% to $49.3 billion. The pandemic exposed critical dependencies on supply chains, policy incentives, and financing mechanisms while simultaneously demonstrating the sector's inherent stability advantages. This analysis contributes to understanding crisis resilience in clean energy markets and provides insights for policy development in future emergency scenarios.

DOI: http://doi.org/10.5281/zenodo.16994016

Implementing Disaster Recovery With Commvault And TSM While Maintaining CRM Continuity Across Salesforce Experience Cloud

Authors: Deepika Sirohi

Abstract: In modern enterprise ecosystems, maintaining continuous operations of Customer Relationship Management (CRM) platforms like Salesforce Experience Cloud is critical for revenue, customer engagement, and regulatory compliance. Disaster recovery (DR) strategies are essential to ensure uninterrupted CRM services across hybrid IT environments, encompassing UNIX, Linux, Windows, and cloud platforms. This review examines the implementation of DR frameworks using Commvault and IBM TSM (Spectrum Protect), highlighting their capabilities, integration approaches, and complementary strengths. Commvault offers hybrid cloud replication, orchestration, and automated failover, while TSM provides efficient incremental backups, long-term retention, and reliable on-premises support. By combining these solutions in a hybrid DR model, enterprises can achieve defined Recovery Time Objectives (RTO) and Recovery Point Objectives (RPO) for Salesforce workloads. The review further explores risk assessment, backup strategies, multi-platform integration, DR testing, monitoring, and continuous improvement processes, emphasizing practical approaches for preserving CRM continuity. Additionally, emerging trends such as AI-driven automation and cloud-native DR strategies are discussed, illustrating how predictive and adaptive technologies can enhance operational resilience. This comprehensive analysis provides IT architects, administrators, and decision-makers with actionable insights to design, implement, and optimize disaster recovery frameworks that safeguard Salesforce Experience Cloud operations while maintaining business continuity, data integrity, and compliance.

DOI: https://doi.org/10.5281/zenodo.17520456

From Bare-Metal Servers To Einstein Copilot: Bridging Legacy Unix Systems With AI-Powered CRM Transformation

Authors: Kamlesh Jangra

Abstract: The convergence of legacy Unix systems with AI-powered Customer Relationship Management (CRM) platforms, such as Salesforce Einstein Copilot, represents a critical strategy for modern enterprises seeking operational continuity and enhanced customer engagement. Legacy Unix servers, including AIX, Solaris, and older Linux distributions, continue to support mission-critical CRM workloads, storing historical data and managing transactional processes with high reliability. At the same time, AI-driven CRM introduces predictive analytics, automated workflows, and intelligent insights that enable personalized customer interactions and strategic decision-making. This review examines methodologies, architectures, and best practices for bridging legacy Unix infrastructure with AI-enhanced CRM, highlighting middleware solutions, API frameworks, hybrid deployment models, and automated data pipelines. It explores challenges related to data compatibility, infrastructure limitations, security, compliance, and organizational change management. Case studies from financial and healthcare sectors illustrate practical implementations and lessons learned, emphasizing phased migration, continuous monitoring, and performance optimization. By synthesizing technical strategies and industry examples, this review provides actionable guidance for IT architects, administrators, and decision-makers to modernize CRM operations, maintain data integrity, and ensure seamless integration of AI capabilities while leveraging the reliability of existing Unix environments. The discussion concludes with insights on emerging trends, including predictive analytics enhancements, cloud-first strategies, edge computing, and automation-driven Unix modernization, framing a future-ready approach to enterprise AI CRM adoption.

DOI: https://doi.org/10.5281/zenodo.17520771

Harnessing AI Dashboards in Oracle Cloud HCM: Advancing Predictive Workforce Intelligence and Managerial Agility

Authors: Kranthi Kumar Routhu

Abstract: The digital transformation of Human Resource Management (HRM) has entered a new phase with the convergence of Artificial Intelligence (AI), advanced analytics, and cloud-based Human Capital Management (HCM) systems. This evolution reflects a global shift from administrative HR operations to data-driven workforce intelligence. Among the leading solutions, Oracle Cloud HCM stands out for its integration of AI-powered analytics, predictive modeling, and configurable dashboards that deliver actionable insights to managers across all levels of the organization. By embedding analytics within HR workflows, Oracle’s HCM platform enables enterprises to automate decision processes, identify workforce trends, and enhance compliance through real-time monitoring and intelligent recommendations. AI-driven dashboards transform traditional HR reporting into a dynamic, interactive decision-support environment, where key performance indicators (KPIs) are continuously analyzed to reveal emerging risks, opportunities, and performance gaps. These systems not only consolidate complex datasets from payroll, recruitment, and performance management modules but also apply machine learning algorithms to predict employee attrition, engagement levels, and talent acquisition efficiency. This paper explores the evolution, architecture, and strategic importance of AI-augmented dashboards in Oracle Cloud HCM, emphasizing their role in enhancing managerial decision-making. It develops a conceptual framework for AI-enabled decision support, detailing how predictive analytics and visualization work together to improve accuracy, transparency, and responsiveness in HR operations. Furthermore, the study discusses practical implementation challenges, including data quality, explainability, and user adoption, and evaluates the tangible benefits of integrating AI-driven dashboards within enterprise HCM systems.
The findings highlight how Oracle Cloud HCM serves as a model for intelligent HR transformation, aligning technology, analytics, and human expertise to support sustainable organizational growth.

DOI: http://doi.org/10.5281/zenodo.17670797

Invisible Risks In Connected Worlds: An IT Risk Management Framework For Cloud-Enabled IoT Systems

Authors: Sasikanth Reddy Mandati

Abstract: The rapid proliferation of cloud-enabled Internet of Things (IoT) systems has transformed modern infrastructure, enabling real-time data collection, intelligent automation, and interconnected services across domains such as smart cities, healthcare, and industrial IoT. However, the increasing complexity and interdependence of these systems also expose them to invisible risks: latent, cascading, and systemic vulnerabilities that are often overlooked by conventional risk management approaches. This paper presents a comprehensive IT risk management framework designed to detect, assess, and mitigate such hidden risks in cloud-enabled IoT environments. The framework integrates real-time data analytics, anomaly detection, and proactive mitigation strategies to improve system resilience and operational reliability. A case study in a smart city scenario demonstrates the framework’s effectiveness, showing significant improvements in risk coverage, mitigation efficiency, and stakeholder confidence compared to baseline methods. The results highlight the critical importance of addressing invisible risks to ensure secure, reliable, and resilient IoT-enabled systems.

DOI: https://doi.org/10.5281/zenodo.17999954

An Analytical Study Of Multi-Cloud Strategies For Enhancing Scalability, Reliability, And Data Security

Authors: Anvi Saxena

Abstract: The rapid growth of cloud computing has transformed the way organizations deploy, manage, and scale their IT infrastructure. Traditional single-cloud deployments often face limitations such as vendor lock-in, scalability bottlenecks, reliability issues, and security vulnerabilities. To address these challenges, multi-cloud strategies have emerged as a viable solution, enabling organizations to leverage multiple cloud service providers simultaneously. This review article presents an analytical study of multi-cloud strategies, emphasizing their impact on scalability, reliability, and data security. Scalability is a critical requirement in modern IT ecosystems, allowing dynamic resource allocation based on workload demands. Multi-cloud strategies enhance scalability by distributing workloads across several providers, enabling organizations to optimize performance and reduce latency. Reliability, or the ability of a system to maintain continuous service despite failures, is also improved in multi-cloud environments. By implementing redundancy and failover mechanisms across multiple clouds, organizations can achieve high availability and disaster recovery capabilities that are difficult with single-cloud architectures. Data security is another crucial consideration, as storing sensitive information across multiple platforms introduces potential vulnerabilities. Multi-cloud strategies can mitigate security risks through encryption, identity and access management, compliance adherence, and robust monitoring practices. This review systematically examines recent literature and case studies, highlighting different multi-cloud approaches, their benefits, and associated challenges. Additionally, it identifies gaps in current research, particularly in areas such as interoperability, orchestration, and automated management. 
The article also explores emerging trends, including AI-assisted cloud management, edge computing integration, and serverless architectures, which can further enhance multi-cloud effectiveness. Ultimately, this review provides a holistic understanding of how multi-cloud strategies contribute to improved scalability, reliability, and data security, offering valuable insights for researchers, IT architects, and organizational decision-makers aiming to optimize cloud infrastructure for the evolving digital landscape.

DOI: http://doi.org/10.5281/zenodo.18159410

Integrating AI And Machine Learning Into SAP HANA For High-Velocity Healthcare And Financial Data Analytics

Authors: Rudra Narayan

Abstract: The exponential growth of data in healthcare and financial sectors presents unique challenges in storage, processing, and real-time analytics. High-velocity data streams—originating from electronic health records (EHRs), IoT medical devices, stock trading systems, and payment networks—require sophisticated frameworks capable of handling large volumes with minimal latency. SAP HANA, an in-memory, columnar database platform, offers real-time processing capabilities that allow organizations to integrate advanced analytics and machine learning (ML) directly into transactional and operational data environments. By leveraging AI and ML, healthcare institutions can predict patient outcomes, optimize treatment plans, and enhance diagnostic accuracy, while financial organizations can detect fraud, assess risk, and execute high-frequency trading strategies efficiently. This review article explores the convergence of AI/ML techniques with SAP HANA for high-velocity data analytics, emphasizing both technical implementation and domain-specific applications. We provide an overview of SAP HANA’s architecture, predictive analytics libraries, and integration approaches with external ML frameworks such as Python, R, TensorFlow, and PyTorch. The article also examines real-time data pipelines, model deployment strategies, and key challenges, including data privacy, scalability, and model interpretability. Case studies in healthcare demonstrate predictive modeling for patient management, disease diagnosis, and imaging analytics, while financial applications highlight fraud detection, real-time risk assessment, and market analytics. Furthermore, the review discusses benefits such as reduced latency, improved decision-making, and operational efficiency, alongside limitations that include heterogeneous data integration, regulatory compliance, and model transparency.
Finally, future research directions are outlined, including deep learning integration, edge computing for real-time analytics, hybrid cloud deployments, and explainable AI methodologies. This review serves as a comprehensive resource for researchers, practitioners, and decision-makers seeking to understand the potential of AI and ML integration within SAP HANA for processing and analyzing high-velocity healthcare and financial data efficiently and effectively.

DOI: http://doi.org/10.5281/zenodo.18159474

Risk-Aware Cloud Computing Frameworks For Secure IoT Communication Over Wireless Network Infrastructures

Authors: Prisha Malviya

Abstract: The rapid proliferation of Internet of Things (IoT) devices, coupled with the high-performance analytical capabilities of Cloud Computing, has created an interdependent ecosystem that relies heavily on wireless network infrastructures. However, this integration introduces significant security vulnerabilities, as the broadcast nature of wireless communication leaves data susceptible to jamming, eavesdropping, and sophisticated man-in-the-middle attacks. This review article systematically investigates the current landscape of Risk-Aware Cloud Computing Frameworks designed to secure IoT communications. We propose a multi-dimensional taxonomy that categorizes these frameworks based on their architectural distribution (Cloud-to-Edge), their risk-assessment methodologies (Probabilistic vs. AI-driven), and their decision-making logic (Reactive vs. Proactive). The article provides a deep dive into the "Resource-Security Paradox," analyzing how risk-aware models optimize the trade-off between cryptographic overhead and device longevity. Furthermore, we provide a comparative analysis of state-of-the-art frameworks, evaluating them against key performance metrics such as detection accuracy, latency, and energy efficiency. Significant attention is given to the role of Software-Defined Networking (SDN) and Trust Management Systems in providing real-time mitigation of wireless threats. Finally, the article identifies critical research gaps and discusses emerging trends, including Zero Trust Architectures (ZTA), Quantum-Resistant Cryptography, and the impact of 6G on IoT security. This review aims to provide a comprehensive reference for researchers and practitioners working to build resilient, self-adaptive security infrastructures for the future of the interconnected world.

DOI: http://doi.org/10.5281/zenodo.18159497

Continuous Integration and Continuous Deployment Tools and Enterprise Practices

Authors: Vinod Kumar Jangala

Abstract: Continuous Integration (CI) and Continuous Deployment (CD) have become essential practices in enterprise software engineering, enabling organizations to deliver high-quality software at an accelerated pace while maintaining reliability and scalability. CI focuses on the frequent integration of code changes into shared repositories with automated builds and testing, whereas CD extends this process by automating application deployment across environments, including production. Together, CI/CD pipelines support DevOps principles by fostering collaboration among development, operations, and quality assurance teams, reducing manual intervention, and enabling rapid feedback loops. This paper presents a comprehensive review of CI/CD tools and enterprise practices, examining how organizations adopt and operationalize these technologies to address the growing complexity of modern software systems. It analyzes widely used CI tools such as Jenkins, GitLab CI, TeamCity, Bamboo, and Travis CI, alongside CD and delivery platforms including Spinnaker, Argo CD, Harness, and GitOps-based frameworks. The review highlights key enterprise adoption practices, performance metrics, and comparative tool capabilities, with particular attention to scalability, security, compliance, and integration with cloud-native technologies such as containers, Kubernetes, and infrastructure-as-code. Challenges related to heterogeneous toolchains, cultural transformation, pipeline performance, and regulatory requirements are critically discussed. Furthermore, the paper explores emerging trends shaping the future of CI/CD, including AI-driven pipeline optimization, DevSecOps, GitOps, multi-cloud orchestration, and edge deployments. 
By synthesizing existing literature and industry practices, this work provides actionable insights for software engineers, DevOps practitioners, and IT managers, while identifying research gaps and future directions to advance reliable, efficient, and secure enterprise-scale CI/CD implementations.

DOI: https://doi.org/10.5281/zenodo.18464806

Hybrid Knowledge Graph And Vector Similarity Architectures For End-to-End Financial Transaction Journey Analysis

Authors: Ramani Teegala

Abstract: By December 2021, financial institutions were operating transaction platforms whose end-to-end behavior increasingly resembled distributed journeys rather than single-system events. A single customer-initiated action, such as a card purchase, an account-to-account transfer, or a cross-border remittance, could traverse channels, risk engines, limits services, payment rails, settlement systems, dispute workflows, and compliance controls across both internal and external counterparties. This fragmentation created persistent challenges in observability, auditability, and root-cause analysis because the underlying data was split across event logs, relational ledgers, message queues, fraud features, and case management systems, each with different identifiers and retention policies. Knowledge graphs matured as a practical representation for integrating heterogeneous entities and relationships, enabling banks to model accounts, customers, devices, merchants, authorizations, postings, reversals, chargebacks, and compliance decisions as a coherent linked structure. In parallel, vector similarity search and embedding-based retrieval became increasingly accessible due to open-source libraries and emerging vector store implementations, providing a complementary mechanism for approximate matching over high-dimensional representations of transactions, sequences, and behavioral signatures. This paper examines how knowledge graphs and vector stores can be combined to represent and analyze financial transaction journeys as understood and practicable by December 2021. The analysis frames the problem through regulated banking constraints, including PCI DSS requirements for cardholder data protection, GLBA expectations for safeguarding customer information, SOX-oriented control evidence, Basel Committee guidance on operational risk, and FFIEC-style expectations for resilient operations and audit readiness.
The paper proposes a conceptual model in which a graph-centric system of record captures identity resolution and explicit relationships, while a vector retrieval layer supports similarity-based enrichment, anomaly surfacing, and candidate linking for incomplete or ambiguous journey traces. It evaluates architectural trade-offs related to consistency, latency, governance, and explainability, emphasizing that approximate methods must be bounded by deterministic controls when outcomes influence fraud actions, customer impact, or regulatory reporting.

DOI: https://doi.org/10.5281/zenodo.19100103

Explainable AI For Cybersecurity Decision-Making

Authors: Farah Syazwani

Abstract: Explainable Artificial Intelligence (XAI) has emerged as a critical paradigm in enhancing trust, transparency, and accountability in cybersecurity systems. As cyber threats become increasingly sophisticated, traditional black-box machine learning models often fail to provide interpretable insights into their decision-making processes, thereby limiting their adoption in high-stakes environments. This review explores the integration of explainable AI techniques within cybersecurity frameworks, focusing on how interpretability improves threat detection, incident response, and risk assessment. The article highlights key methodologies such as feature attribution, model-agnostic explanations, and rule-based learning that enable analysts to understand and validate model outputs. Additionally, the role of XAI in regulatory compliance and ethical AI deployment is examined, emphasizing the need for transparency in automated decision systems. Challenges such as trade-offs between accuracy and interpretability, adversarial manipulation of explanations, and scalability issues are also discussed. Emerging trends, including hybrid explainability approaches and human-in-the-loop systems, are presented as promising directions for future research. By bridging the gap between complex machine learning models and human understanding, XAI holds significant potential to transform cybersecurity decision-making into a more reliable and interpretable process. This review provides a comprehensive overview of current advancements and outlines future pathways for integrating explainable intelligence into cybersecurity infrastructures.

DOI: https://doi.org/10.5281/zenodo.19492116



Intelligent SD-WAN Management Using AI

Authors: Siti Rahmawati

Abstract: The rapid proliferation of cloud-native applications, hybrid work models, and bandwidth-intensive services has fundamentally challenged the static nature of traditional Wide Area Networks (WAN). Software-Defined WAN (SD-WAN) introduced a centralized control plane to decouple network software from hardware, yet the manual definition of steering policies often fails to account for the highly volatile nature of internet transport circuits. This review examines the paradigm shift toward Intelligent SD-WAN Management powered by Artificial Intelligence (AI) and Machine Learning (ML). By leveraging deep learning architectures and reinforcement learning agents, SD-WAN controllers can now transition from reactive, threshold-based switching to proactive, intent-driven optimization. This article explores the core methodologies of AI-integrated management, focusing on predictive traffic engineering, automated root cause analysis, and self-healing infrastructure. We analyze how AI models optimize Quality of Experience (QoE) for mission-critical applications—such as VoIP and real-time video—by analyzing multi-dimensional telemetry including jitter, latency, and packet loss in real-time. Furthermore, the review addresses the critical challenges of model interpretability in network operations, the "cold start" problem in new deployments, and the necessity for federated learning to ensure data privacy across multi-tenant SD-WAN environments. By synthesizing recent academic breakthroughs and industrial implementations, this paper provides a strategic roadmap for building "Self-Driving WANs." The findings suggest that AI-integrated management not only reduces operational expenditure by automating complex routing decisions but also provides the cognitive intelligence required to manage the unpredictable performance of commodity internet underlays in a global digital economy.
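
As a minimal, illustrative sketch (the weights and telemetry values are assumptions, not any vendor's SLA formula), the threshold-style steering that AI-driven controllers improve upon can be written as:

```python
def qoe_score(latency_ms, jitter_ms, loss_frac):
    # A simple composite impairment score (lower is better). Real-time
    # voice is hurt more by jitter and loss than by moderate latency,
    # hence the heavier illustrative weights on those terms.
    return latency_ms + 4.0 * jitter_ms + 50.0 * loss_frac

def pick_path(telemetry):
    # Steer the flow onto the transport with the best (lowest) score.
    # An AI-driven controller would instead *forecast* these metrics
    # and switch proactively, before the SLA is breached.
    return min(telemetry, key=lambda path: qoe_score(*telemetry[path]))

telemetry = {
    "mpls":      (40.0, 2.0, 0.001),   # latency ms, jitter ms, loss fraction
    "broadband": (25.0, 9.0, 0.020),
    "lte":       (60.0, 5.0, 0.005),
}
print(pick_path(telemetry))
```

Here broadband has the lowest latency but loses on jitter and loss, which is exactly the multi-dimensional trade-off the abstract describes.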

DOI:

Published by:

Integrating Risk Management: Technology’s Role in Bridging FinTech and Traditional Banks

Uncategorized

Integrating Risk Management: Technology’s Role in Bridging FinTech and Traditional Banks
Authors:- Chintamani Bagwe

Abstract- This article surveys risk management strategies in the financial industry, comparing the practices of FinTech companies and traditional banks in light of a regulatory landscape that is adapting to technological change. Risk management has always been central to the financial sector, ensuring stability and safety for all stakeholders. FinTech companies, which deliver many traditional banking services through technology, introduce unique offerings that carry new risk types, such as cybersecurity and regulatory risk, requiring markedly different mitigation measures. Their responses include automation, flexible ecosystem formation, and customer-focused organizational structures. Traditional banks, in contrast, have developed time-tested risk management schemas spanning enterprise risk management, regulatory compliance, and risk aversion. However, they face their own complexities in the heavy burden of maintaining compliance and integrating legacy systems with emerging technology. Collaboration with FinTech firms offers a way forward through shared, cross-organizational processes. The regulatory space will also change, and both FinTech firms and traditional banks will have to adapt to continue operating. In the future, existing and new technologies will be integrated further, resulting in better risk management schemas. Successful partnerships between FinTech firms and traditional banks are presented at the end. Such collaboration will be essential to navigating the current transformation of the industry.

DOI: 10.61137/ijsret.vol.10.issue5.793


IJSRET Volume 7 Issue 5, Sept-Oct-2021

Uncategorized

Comparative Analysis of Balancing Methods for Classifying Imbalanced Data
Authors:- Himani Tiwari, Dr. Sheetal Rathi

Abstract- Classification of data with an imbalanced class distribution exposes significant performance shortcomings in most standard classification learning algorithms, which assume that the class distribution is relatively balanced and that misclassification costs are equal across classes. This article reviews the classification of imbalanced data: areas of application, the nature of the problem, the learning difficulties faced by standard classification algorithms, learning objectives and evaluation measures, reported research solutions, and class imbalance problems when more than two classes are present.
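
As a minimal sketch of one balancing method from the family this review covers, random oversampling can be shown on toy data (illustrative only; resampling-based methods such as SMOTE refine this idea):

```python
import random

def random_oversample(X, y, seed=0):
    # Duplicate minority-class samples at random until every class
    # matches the size of the largest class. Crude next to SMOTE,
    # but it shows why balancing changes what a classifier learns.
    rng = random.Random(seed)
    by_class = {}
    for x, label in zip(X, y):
        by_class.setdefault(label, []).append(x)
    target = max(len(rows) for rows in by_class.values())
    Xb, yb = [], []
    for label, rows in by_class.items():
        extra = [rng.choice(rows) for _ in range(target - len(rows))]
        for x in rows + extra:
            Xb.append(x)
            yb.append(label)
    return Xb, yb

X = [[i] for i in range(10)]
y = [0] * 8 + [1] * 2           # 8:2 imbalance
Xb, yb = random_oversample(X, y)
print(yb.count(0), yb.count(1))  # balanced class counts
```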

Enhancement of Hybrid Power Generation System with VSC Based Power Compensation in Faulty Conditioning
Authors:- Shailendra Lodhi, Chandra Shekhar Sharma

Abstract- In today's technological world, electricity is one of the most important aspects of daily life. Conventional energy sources are depleting rapidly, so it is time to shift the focus toward renewable sources for electricity generation, even though the output of renewable sources is lower than that of their conventional counterparts. Renewable sources, however, have no adverse effects on the environment. The solar-wind hybrid system is a combination of a solar plant and a wind power plant, and it helps provide an uninterrupted electricity supply: in bad weather, generation can be shifted from one plant to the other with the help of a VSC-based power compensator. The VSC power compensator ensures efficient use of resources and improves the power quality of the integrated system compared to either generation mode alone. It reduces reliance on a single source and makes the system more reliable. The hybrid system can be used for both industrial and domestic applications.

Designing Of Power and Delay Efficient 10T and 14T SRAM Cell
Authors:- M.Tech. Scholar Sanjay Mongiya, Prof. Pratha Mishra, Prof. Sandip Nemade, Prof. Dr. Vikas Gupta

Abstract-This work presents an analysis of popular 1-bit full adder circuits. The analysis metrics comprise power, delay, power-delay product, area, and threshold loss. As an important unit of many hardware computational blocks, the transistor-level design of the full adder circuit has been evolving for decades; this comparative study focuses on the most highly cited designs of the last two decades. The paper presents the design of a new, stable, power-efficient 14T full adder circuit. The proposed circuit is based on a Pass Transistor Logic (PTL) network using NMOS transistors only. It is simulated at layout level using the LTSpice tool and evaluated in terms of power and voltage levels at the sum and carry nodes. Its performance is compared with similar 14T adder circuits, and the proposed adder is found to consume lower power due to smaller load capacitance and parasitic resistance, while the logic levels at the sum and carry nodes remain at a strong 1 or strong 0 because of the circuit's design architecture. The paper also analyzes and compares 10T and 14T one-bit full adders, including the most notable designs, for speed, leakage power, and leakage current. The analysis was performed across various process and circuit techniques: with minimum transistor sizes to minimize leakage power, and with simulated transistor dimensions to minimize leakage current. The simulations were carried out in LTSpice using 65 nm technology.
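
For reference, the Boolean behaviour that any 10T or 14T cell must realize is the standard full-adder truth table; a quick software check of those equations:

```python
def full_adder(a, b, cin):
    # Standard 1-bit full adder logic; transistor-level designs such as
    # the 10T/14T cells above implement exactly this truth table.
    s = a ^ b ^ cin                   # sum
    cout = (a & b) | (cin & (a ^ b))  # carry out
    return s, cout

# Exhaustively check the logic against plain integer addition.
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s, cout = full_adder(a, b, cin)
            assert 2 * cout + s == a + b + cin
print("all 8 input combinations verified")
```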

Attribute-Based Temporary Keyword Search Scheme in Cloud Storage Server
Authors:- M. Tech. Scholar Sindhu Mathuku, Asst. Prof. V Dakshayani, Asst. Prof. V Subhasini

Abstract- Attribute-based keyword search (ABKS), an important type of searchable encryption, has been widely utilized for secure cloud storage. In a key-policy attribute-based temporary keyword search (KP-ABTKS) scheme, a private key is associated with an access policy that controls the search capability of the user, while a search token is associated with a time interval that controls when the cloud server may search. However, after careful study, we find that the only existing KP-ABTKS construction [1] is not secure. Through two carefully designed attacks, we first show that the cloud server can search the ciphertext at any time; as a result, that scheme cannot support temporary keyword search. To address this problem, we present an enhanced KP-ABTKS scheme and prove that it is selectively secure against chosen-keyword attack in the random oracle model. The proposed scheme achieves fine-grained search control and temporary keyword search simultaneously. In addition, the performance evaluation indicates that our scheme is practical.

Design and Implementation of IoT Based Four Way Women's Safety Device
Authors:- Asst. Prof. Dr. M. Dhinesh Kumar, A. Arunmozhi, L. Geetha, R. Sandhiya, S. Subalakshmi

Abstract-The present era is one of equal rights, in which men and women take equal responsibility in their work. Women compete alongside men in all fields and are assigned work in both regular and odd shifts. Yet every day, women and young girls from all walks of life are assaulted, molested, and raped; streets, public transport, and public spaces in particular have become the territory of predators, and for these reasons many women hesitate to step out of their homes. We propose a device that integrates multiple components: the hardware comprises a wearable smart gadget that continuously communicates with a smartphone that has internet access. The gadget also provides a self-defence capability that helps the wearer escape critical situations. The system can be used at places such as bus stops, railway stations, offices, footpaths, shopping malls, and markets. The implementation of the smart gadget is split into two sections. The first captures an image of the culprit: the device is triggered automatically when suspicious motion is detected in front of the camera, captures the image, and sends it as an e-mail attachment to a designated address along with the victim's location. The captured image serves as valid proof against the person who committed the crime. By making self-defence the first priority, we aim to prevent critical situations from escalating. The self-defence feature works in either of two modes. With internet access, a smart pendant with an LED flash places an alert call to family and relatives via the cloud and shines the LED into the culprit's eyes to blur their vision at short distances. Without internet access, electric-shock gloves deliver shocks that distract the culprit and deter the attack. Together, these two modes form the combined self-defence application and help the victim escape from danger.

Four-Switch Three-Phase Inverter-Fed Im Drives-Literature Review
Authors:- M.Tech. Scholar Yatin Kumar, Dr. Shweta Chourasia, Dr. E. Vijay Kumar

Abstract- Three-phase induction motors are among the most commonly used electric machines in industrial applications due to their low cost and simple, robust construction, and three-phase inverters are an essential part of variable-speed AC motor drives. A new speed-estimation adaptation law, which ensures estimation stability and fast error dynamics, is derived based on Lyapunov theory. Furthermore, a Fuzzy Logic Controller (FLC) is presented as a nonlinear optimizer to minimize the speed tuning signal used for rotor speed estimation. This paper provides a detailed survey of past work in the inverter field; theoretical and experimental work on different DC/AC and AC/DC inverter techniques is discussed.

Performance Analysis of Bidirectional Grid-Connected Single-Power-Conversion
Authors:- Pankaj Madheshiy, Dr. Shweta Chourasia, Dr. E.Vijay Kumar

Abstract- Power converter design aims at improving efficiency, but as a first approximation, and in order to characterize fundamental topologies, it is useful to assume that no losses occur in the conversion process of a power converter. Under this hypothesis, the fundamental components are of two kinds: non-linear components, mostly electronic switches (semiconductors used in switching mode), and linear reactive components (capacitors, inductors, and coupled inductors or transformers). These reactive components provide intermediate energy storage for voltage and current filtering, and they generally represent a significant portion of the size, weight, and cost of the equipment. This introductory work reviews and gives precise definitions of basic concepts essential for understanding and designing converter topologies. First the sources and the switches are defined; then the key connection rules between these fundamental components are reviewed, and from there converter topologies are derived. Several examples of topology synthesis are given. Finally, the concepts of hard and soft commutation are introduced. Simulation is performed using MATLAB Simulink software.

Performance and Selection of Thermoelectric Module for Given Temperature Range
Authors:- Prajyot Chavan, Azeem Peerjade, Shahid Jamadar, Sushant Sutar, Asst. Prof. Dipak Patil

Abstract- Experimental investigations of several commercially available thermoelectric coolers (TECs) and thermoelectric generators (TEGs) are conducted in industrial settings to evaluate performance trends. The experimental setups and the parameters determining the performance and operation of thermoelectric modules are analyzed, showing how thermoelectric modules are used across different industrial applications. The paper concludes that thermoelectric modules have a wide range of applications and presents an analysis intended to ease their adoption in the engineering sector.

A Review on Design and Analysis of Ladder Chassis
Authors:- M.Tech. Scholar Shubham Agrawal, Prof. Arun Patel

Abstract- One of the major challenges in vehicle development is the design of the chassis. Chassis design begins with an analysis of load cases; four loads acting on the chassis must be considered. These loads are important considerations because of ride safety and passenger comfort, and the magnitude of the stresses arising from them can be used to predict the chassis's performance. An automotive chassis is made of a steel frame, aluminium, or composite material. This study reviews the past literature.

Survey on Privacy Preserving Mining Techniques And Application
Authors:- Ph.D. Scholar Jayshree Boaddh, Dr. Shailja Sharma

Abstract- Digital platforms increase the ease of data organization and use. Extraction of information from raw data is performed by data mining algorithms. This information has many applications, but some miners extract knowledge that can affect the privacy of individuals, organizations, or communities. This paper therefore focuses on techniques that protect data privacy against data mining algorithms. It surveys recent methodologies proposed by different researchers, describes some data mining methods that help with information extraction, and details the evaluation parameters used to compare privacy-preserving methods.

A Review On Thermal Performance Optimization And Cfd Analysis Of Double Pipe Heat Exchanger
Authors:- M.Tech Scholar Rahul Sahu, Assistant Professor N.V. Saxena

Abstract- One of the simplest and most widely applicable heat exchangers is the double pipe heat exchanger (DPHE). This kind of heat exchanger is widely used in the chemical, food, and oil and gas industries. Because of its relatively small diameter, many careful studies have found this type of heat exchanger well suited to high-pressure applications, and it is also valuable where a wide temperature range is required. It is well documented that DPHEs contribute significantly to pasteurizing, reheating, preheating, digester heating, and effluent heating processes, and many small industries use them because of their low design and maintenance costs. We therefore conclude that previous research on this type of heat exchanger should be categorized in order to simplify the choice of the most appropriate methods.
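
As a quick illustration of the thermal sizing behind such studies (the numbers are illustrative, not from any cited experiment), the duty of a double pipe heat exchanger follows from the log-mean temperature difference:

```python
import math

def lmtd(dt1, dt2):
    # Log-mean temperature difference between the terminal temperature
    # differences dt1 and dt2 (in K) at the two ends of the exchanger.
    if dt1 == dt2:
        return dt1  # limiting value as dt1 -> dt2
    return (dt1 - dt2) / math.log(dt1 / dt2)

def duty(U, A, dt1, dt2):
    # Heat transferred: Q = U * A * LMTD, with U in W/m^2K and A in m^2.
    return U * A * lmtd(dt1, dt2)

# Example: terminal temperature differences of 60 K and 20 K.
print(round(lmtd(60.0, 20.0), 2))  # ~36.41
```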

Improvement Of Statcom With Grid Connected Flicker Minimization And Power Quality Improvement
Authors:- Deepesh Patel, Asst. Prof. Shivendra Singh Thakur

Abstract- The injection of PV grid power into an electric grid affects power quality. The influence of PV generation on the grid, with respect to the power quality measurements and norms specified in the International Electrotechnical Commission standards, includes active and reactive power variations, voltage variations, flicker, harmonics, and the electrical behaviour of switching operations. The study demonstrates overall good functional characteristics, better performance, and faster response than existing systems. The proposed system, which uses a STATCOM, is smaller and less costly than the existing system. In this system, a static compensator (STATCOM) is connected at a point of common coupling together with a battery energy storage system to reduce power quality issues. The proposed scheme effectively supplies the reactive power demand of the load and the induction generator. Simulation is performed using MATLAB/Simulink SimPowerSystems software.

A Review On Hybrid Energy Based On Mppt Techniques
Authors:- M.Tech. Scholar Anshu Bala, Professor Vinay Pathak

Abstract-This paper provides a succinct and well-organized overview of different maximum power point tracking (MPPT) algorithms used in photovoltaic (PV) generating systems that may operate under partial shading. To date, a broad range of algorithms, PV modelling methods, PV array designs, and controller topologies have been investigated. However, every method has both benefits and drawbacks; as a consequence, a thorough literature study is required when building a PV generating system (PGS) under partial-shading conditions. This article presents such a review of MPPT algorithms, divided into four major categories: entirely new MPPT optimization algorithms, hybrid MPPT algorithms, novel modelling approaches, and different converter topologies. The article offers an accessible reference for large-scale research on PV systems under partial shading in the near future.

Thermal Analysis of Heat Sink with Perforation Techniques Using Ansys
Authors:- M.Tech. Scholar Umesh Badode, Asst. Prof. Deepak Solanki

Abstract-The engine cylinder is one of the essential engine components subjected to high temperatures and thermal stress. Fins on the cylinder surface enhance convective heat exchange. Heat is produced by fuel combustion inside a vehicle engine, and friction between moving components generates additional heat. Air-cooled IC engines have fins, in the form of extended surfaces surrounding the cylinders, to improve heat transfer, and fin analysis is an important endeavour for increasing the heat transfer rate. This study reviews past work on fin heat transfer enhancement, examining changes in the form and composition of cylinder fins. The ANSYS program was utilised to examine the impact of fin shape and size on heat exchange for various fin geometries, including pin fin, tube fin, perforated, and plate fin geometries. Temperature changes in fins have been investigated experimentally; one study assessed temperature changes in field performance models and compared them with experimental data in ANSYS. Methods of making the most of the airflow to aid heat dissipation are also considered. The study's goal is to improve thermal performance via modifications in form, material, and small-scale design.
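
One step of such fin analysis can be sketched with the classical straight-fin efficiency formula for an adiabatic tip (the numbers below are illustrative, not from the reviewed studies):

```python
import math

def fin_efficiency(h, P, k, Ac, L):
    # Efficiency of a straight fin with an adiabatic tip:
    #   m = sqrt(h*P / (k*Ac)),  eta = tanh(m*L) / (m*L)
    # h: convection coefficient (W/m^2K), P: fin perimeter (m),
    # k: thermal conductivity (W/mK), Ac: cross-section (m^2), L: length (m).
    m = math.sqrt(h * P / (k * Ac))
    return math.tanh(m * L) / (m * L)

# Illustrative values chosen so that m*L = 1:
eta = fin_efficiency(h=100.0, P=0.1, k=100.0, Ac=0.001, L=0.1)
print(round(eta, 4))  # ~0.7616
```

Lengthening the fin raises total area but lowers efficiency, which is the geometry trade-off these ANSYS studies explore.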

Image Processing: An Application of Machine Learning
Authors:- Duggineni Srinivasa Rao

Abstract- In the current data-driven world, data holds significant information if processed correctly. Data can take the form of images, which can be a rich source of useful insights and early knowledge of emerging issues. The concern is that deriving information from images manually is a tedious task for human beings and incurs heavy cost and time, so an easier and cheaper approach is to teach a machine to do the task efficiently. The concept of using machines to perform human tasks is known as Machine Learning. In this paper, I present a review of the literature on image processing in machine learning and how image processing has helped identify issues at early stages so that they can be resolved without causing much harm. Image processing has also been a helpful tool in computer vision.

DC Microgrid for Solar and Wind Power Integration
Authors:- Ashok Singh Bhauryal, Nisha Kaintura, Tanya, Yash Pratap Singh

Abstract-Micro-grid systems are presently considered a reliable solution to the expected deficiency in the power required from future power systems. Renewable power sources such as wind and solar offer high potential for clean power in future micro-grid systems. A Micro-Grid (MG) is basically a low-voltage (LV) or medium-voltage (MV) distribution network consisting of a number of distributed generators (DGs) such as photovoltaic arrays and wind turbines, energy storage systems, and loads, operating as a single controllable system that can run in both grid-connected and islanded modes. The capacity of the DGs is sufficient to support all, or most, of the load connected to the micro-grid. This paper presents a micro-grid system based on wind and solar power sources and addresses issues related to the operation, control, and stability of the system. Using MATLAB/Simulink, the system is modelled and simulated to identify the relevant technical issues involved in the operation of a micro-grid system based on renewable power generation units.

Face Recognition using Deep Neural Network
Authors:- Research Student Amritpal Kaur, Asst.Prof. Shaveta Bala

Abstract- Face recognition is one of the fundamental challenges in many applications of computer vision and pattern recognition. The first step of the process is to detect facial features in a video or in digital images; the next is to recognize the person present in the frame by comparing those facial features with the features stored in a database. Various types of classifiers are used to extract and reduce the number of facial features, and many learning techniques have been built over the last two decades for face detection and recognition, including holistic learning, local handcrafted features, and shallow learning. In the last decade, deep learning has brought great improvement to the field of face recognition, with convolutional neural networks used to learn the features of objects. In this paper, a deep neural network technique with back propagation is proposed to identify and recognize the faces of various famous persons. Objective parameters such as precision, recall, and F1 score are used to evaluate the performance of the proposed technique.
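
The evaluation metrics named above can be computed directly; a small sketch with toy labels (not the paper's dataset):

```python
def precision_recall_f1(y_true, y_pred, positive=1):
    # Standard binary-classification metrics, here framed as evaluating
    # a face recognizer's per-identity predictions.
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

y_true = [1, 1, 1, 1, 0, 0, 0, 0]   # 1 = target identity present
y_pred = [1, 1, 1, 0, 1, 0, 0, 0]   # one miss, one false alarm
print(precision_recall_f1(y_true, y_pred))  # (0.75, 0.75, 0.75)
```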

A Study On Anti Ramsey Coloring Problems
Authors:- M.Phil Scholar M. Susila, Asst.Prof. A. Mallika

Abstract- Let ar(G, H) be the maximum number of colors such that there exists an edge-coloring of G with ar(G, H) colors in which every subgraph isomorphic to H has at least two edges of the same color. We call ar(G, H) the anti-Ramsey number of the pair of graphs (G, H). In this paper, we determine the anti-Ramsey number for special graphs.
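
The definition above can be checked by brute force on a tiny case; the sketch below computes ar(K_4, K_3) and agrees with the classical value ar(K_n, K_3) = n - 1:

```python
from itertools import combinations, product

def anti_ramsey_triangle(n):
    # Brute-force ar(K_n, K_3): the maximum number of colors in an
    # edge-coloring of K_n with no rainbow triangle, i.e. every
    # triangle has at least two edges of the same color.
    edges = list(combinations(range(n), 2))
    triangles = [tuple(combinations(t, 2)) for t in combinations(range(n), 3)]
    best = 0
    for coloring in product(range(len(edges)), repeat=len(edges)):
        color = dict(zip(edges, coloring))
        if all(len({color[e] for e in tri}) < 3 for tri in triangles):
            best = max(best, len(set(coloring)))
    return best

print(anti_ramsey_triangle(4))  # 3, i.e. n - 1 for n = 4
```

Only feasible for very small n (the search is exponential in the number of edges), but it makes the "at least two edges of the same color" condition concrete.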

Inter-laminar Fracture of Composites Materials for Aerospace Structures
Authors:- Research Scholar Imran Abdul Munaf Saundatti, Prof. Dr G. R. Selokar (Supervision)

Abstract- The aim of the present research is to gain a better understanding of inter-laminar fracture of polymer matrix composites under various modes, and to develop an analytical model to predict the critical strain energy release rates. Emphasis has been placed on the root rotation at the crack tip, which is believed to be a critical factor affecting delamination fracture toughness and critical load. A combined experimental and theoretical investigation was conducted to determine the role of root rotation on critical load, and the goal of predicting the dependence of critical strain energy release rate on root rotation under mode I is achieved. The first part of the present study analyzes the inter-laminar fracture toughness of Double Cantilever Beam (DCB) specimens based on a modified Timoshenko beam model.

Grid connected Solar Powered Water Pumping System Utilizing Improved Control Technique
Authors:- Suvek Kumar, Prof. Vinay Pathak

Abstract- This paper discusses the scope and limitations of photovoltaic solar water pumping systems. The components and functioning of a PV solar pumping system are described, and the research work of previous noteworthy researchers is reviewed. Irrigation is a well-established practice on many farms and is carried out at various scales around the world; it allows diversification of crops while increasing crop yields. However, typical irrigation systems consume a great amount of conventional energy through electric motors and fuel-powered generators. Photovoltaic energy can find many applications in agriculture, providing electrical energy in various cases, particularly in areas without an electric grid. This paper proposes a single-stage grid-interactive solar-powered switched reluctance motor (SRM) driven water pumping system with an efficient control technique. The control of the proposed system provides proficient maximum power point tracking (MPPT) and motor drive control with bidirectional power flow between the photovoltaic (PV) array and the single-phase grid. It offers harmonic elimination, improved dynamic performance, and DC-offset rejection compared to other controls. A PV feed-forward term is also incorporated in the developed control to enhance the dynamic performance of the system and to minimize the size of the DC-link capacitor with improved MPPT performance. A novel scheme of fundamental switching of the SRM drive over its maximum operational time (when the grid is present) makes the system efficient and reliable. An improved perturb and observe (P&O) based MPPT algorithm is used to minimize undesirable losses in the PV array, especially under varying insolation levels. The proposed control is tested on a developed prototype, and its suitability is validated through simulation and test results under various conditions.
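
A minimal sketch of the classic P&O loop named above, on an assumed toy P-V curve (real curves vary with insolation and temperature, which is what the paper's improved variant addresses):

```python
def pv_power(v):
    # Toy concave P-V curve with its maximum power point at 18 V.
    return max(0.0, 100.0 - (v - 18.0) ** 2)

def perturb_and_observe(v0, step=0.5, iterations=100):
    # Classic P&O: keep perturbing the operating voltage in the same
    # direction while power rises; reverse direction when it falls.
    v, direction = v0, +1
    p = pv_power(v)
    for _ in range(iterations):
        v += direction * step
        p_new = pv_power(v)
        if p_new < p:
            direction = -direction
        p = p_new
    return v

v = perturb_and_observe(12.0)
print(round(v, 1))  # oscillates near the 18 V maximum power point
```

The steady oscillation around the peak is the well-known drawback of plain P&O that improved variants try to damp.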

A Review on Grid Connected Hybrid Renewable Energy System Using Dynamic Voltage Restorer
Authors:- Gyanoday Kumar, Prof. Vinay Pathak

Abstract- This paper presents a new system for integrating a grid-connected photovoltaic (PV) system with a self-supported dynamic voltage restorer (DVR). Power quality (PQ) is gaining importance as more sensitive loads are introduced into the utility grid; degraded product quality, equipment damage, and temporary shutdowns are the typical industrial consequences of PQ problems, and any mal-operation or damage of sensitive industrial loads results in monetary losses disproportionately higher than the severity of the PQ event. The evolution of power electronics has replaced traditional power quality mitigation methods with Custom Power System (CUPS) devices; the major power-electronic-controller-based CUPS devices are the DSTATCOM, the DVR, and the UPQC. The DVR is a pertinent, and among CUPS devices the most cost-effective, solution to the economic losses caused by PQ issues in industry. Only a few published papers review DVR technology; in this paper, a systematic review of the published literature is conducted and the design, standards, and challenges of DVR technology are described. In addition to the energy variability of renewable sources, random voltage sags, swells, and interruptions are already a major issue in power systems, and recent advances in power electronic devices have provided a platform for new solutions to the voltage support problem.

Modelling of Solar and Grid Connected System Based on Bidirectional DC to DC Converter
Authors:- M.Tech Scholar Vikas Kumar, Prof. Vinay Pathak

Abstract- The goal of this paper is to create a maximum power point tracker that uses fuzzy logic control methods. Fuzzy logic makes an ideal controller for such nonlinear situations, taking advantage of artificial intelligence techniques that can model nonlinear systems despite their complexity. An MPPT system made up of photovoltaic modules was designed and simulated. MPPT works by using a tracking algorithm to discover and sustain operation at the maximum power point; due to changes in temperature, solar radiation, and load, the photovoltaic module's maximum power fluctuates. A maximum power point tracker (MPPT) is used in the photovoltaic system to continually harvest the highest power from the solar panel and transfer it to the load in order to maximize efficiency. The general structure of the MPPT system comprises a DC-DC converter (an electrical device that transforms DC energy from one voltage level to another), a controller, batteries, and a fuzzy logic controller. To determine the best topology for the PV system, the buck, boost, and buck-boost converters are characterised. The integrated model of the PV module with the selected converter and battery is simulated in MATLAB Simulink.
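
For context on the topology comparison, the ideal continuous-conduction conversion ratios of the three candidate converters are standard textbook results; a small sketch:

```python
def ideal_vout(topology, vin, duty):
    # Ideal (lossless, continuous-conduction) steady-state output
    # voltage of the three candidate DC-DC topologies as a function
    # of duty cycle D: buck D, boost 1/(1-D), buck-boost -D/(1-D).
    if not 0.0 <= duty < 1.0:
        raise ValueError("duty cycle must be in [0, 1)")
    if topology == "buck":
        return vin * duty
    if topology == "boost":
        return vin / (1.0 - duty)
    if topology == "buck-boost":
        return -vin * duty / (1.0 - duty)  # output polarity is inverted
    raise ValueError("unknown topology")

for topo in ("buck", "boost", "buck-boost"):
    print(topo, ideal_vout(topo, vin=24.0, duty=0.5))
```

The buck-boost's ability to move the operating point both above and below the panel voltage is what makes it attractive for MPPT duty-cycle control.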

Performances of Hybrid Renewable Energy Based Electrical Charging Station
Authors:- M.Tech Scholar Pooja Tiwari, Prof. Vinay Pathak

Abstract- Electric vehicles (EVs) represent one of the most promising technologies for greening transportation systems. An important issue is that high EV penetration places heavy electricity demand on the power grid, while charging stations have become an important means of reaching remote areas and maximizing economic, technological and environmental benefits. In this thesis, a combination of solar energy, a diesel generator and electric vehicles gave excellent results in ensuring an uninterruptible power supply during periods of low PV irradiance. The main element is a photovoltaic system designed to satisfy the daily load energy requirement. A three-phase active filter is used to improve power quality, manage power and correct unbalance. Backup energy storage systems, including plug-in hybrid electric vehicles (PHEVs) and the diesel generator, ensure an uninterruptible power supply in case of low solar irradiation. An effective way to reduce the grid impact is to integrate local power generation, such as renewable energy sources (RES), into the charging infrastructure. Due to the intermittent and non-dispatchable nature of RES, coordinating electric-vehicle charging with other grid loads and renewable generation is very challenging. This work studies the charging of electric vehicles with smart grid technology and reviews its interaction with renewable energy. It first introduces electric vehicles and renewable energy, covering the main types of electric vehicles and estimation methods for renewable generation. In line with the objectives, the existing research is divided into three categories: cost-aware, efficiency-aware and emission-aware interaction between electric vehicles and renewable energy. Each category contains a description of the core idea, an overview of the solutions and a comparison between different works.
Finally, some important open questions related to the interaction between electric vehicles and RES are given, and possible solutions are discussed. To preserve battery life, the PHEV supplies power to the load only during emergencies. This motivates the development of a robust algorithm, sizing and energy management scheme to balance load consumption and electricity production; the simulation is performed in MATLAB Simulink.

Implementation of Heuristic Methods in Manufacturing Industry
Authors:- M. Tech. Scholar Shashank Mishra, Prof. Hari Mohan Soni

Abstract- Assembly Line Balancing (ALB) is the classic problem of allocating tasks to workstations so that downtime is minimized and precedence constraints are met. ALB enables the best use of available resources so that satisfactory production rates are reached at minimum cost. Rebalancing is necessary whenever the process changes, such as when tasks are added or deleted, components change, processing times change, or new processes are implemented.
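The allocation described above can be sketched with the classic largest-candidate heuristic: fill each workstation with the longest task whose predecessors are done and which fits in the remaining cycle time. The task times, precedence relations and the 7-second cycle time below are hypothetical, not data from the paper.

```python
def balance_line(tasks, precedence, cycle_time):
    """Greedy assembly-line balancing: open stations one at a time and
    fill each with the longest eligible task (predecessors finished,
    task time fits in the station's remaining cycle time)."""
    done, stations = set(), []
    while len(done) < len(tasks):
        remaining, station = cycle_time, []
        while True:
            eligible = [t for t in tasks
                        if t not in done
                        and precedence.get(t, set()) <= done
                        and tasks[t] <= remaining]
            if not eligible:
                break
            t = max(eligible, key=lambda t: tasks[t])  # largest-candidate rule
            station.append(t)
            done.add(t)
            remaining -= tasks[t]
        stations.append(station)
    return stations

# Hypothetical task times (seconds) and precedence constraints
tasks = {"A": 4, "B": 3, "C": 2, "D": 5, "E": 1}
precedence = {"C": {"A"}, "D": {"A", "B"}, "E": {"C", "D"}}
print(balance_line(tasks, precedence, cycle_time=7))  # [['A','B'], ['D','C'], ['E']]
```

With a 7 s cycle the heuristic needs three stations for 15 s of work, a balance efficiency of 15/21 ≈ 71%; exact methods or other heuristics (ranked positional weight, etc.) may do better.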

Review of Multiplexer Design Using QCA
Authors:- M.Tech. Scholar Rajesh Kumar, Asst. Prof. Mr. K. K. Sharma

Abstract- A novel design of a quantum-dot cellular automata (QCA) 2-to-1 multiplexer is presented. The objective is the development of a modular design methodology that can be used to build 2^n-to-1 multiplexers from building blocks. In the QCA implementation, careful design consideration is given to increasing device stability. The proposed multiplexer is designed and simulated using the QCADesigner tool.
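The modular methodology, composing 2^n-to-1 multiplexers from 2-to-1 blocks, can be checked behaviourally before any QCA layout work. This is a logic-level sketch, not a QCADesigner model:

```python
def mux2(a, b, s):
    """Behavioral model of the 2-to-1 multiplexer block: Y = a·s' + b·s."""
    return b if s else a

def mux(inputs, selects):
    """Build a 2^n-to-1 multiplexer recursively from 2-to-1 blocks,
    mirroring the modular methodology: selects[0] is the MSB that
    chooses between two half-size multiplexers."""
    if not selects:
        return inputs[0]
    half = len(inputs) // 2
    return mux2(mux(inputs[:half], selects[1:]),
                mux(inputs[half:], selects[1:]),
                selects[0])

# 4-to-1 mux from three 2-to-1 blocks: select index 2 (binary 10)
print(mux([0, 1, 2, 3], [1, 0]))  # 2
```

An n-bit select line needs 2^n - 1 of the 2-to-1 blocks, which is the cost model the building-block approach makes explicit.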

Improving the Performance of Neural Networks
Authors:- Satwik Ram Kodandaram

Abstract- Deep Learning is a subfield of Machine Learning whose models loosely mimic the neural networks of the human brain. Deep Learning models are nonlinear; they offer increased flexibility and can scale in proportion to the available training data. The downside of this flexibility is that weights are calculated and updated via a stochastic training algorithm, which makes them sensitive to the training data: each training run may yield a different set of weights and produce different predictions. This situation is generally referred to as neural networks with high variance, and it makes producing a final model for prediction very difficult. Deep Learning models also often take a long time to train, requiring high-end computational resources such as GPUs or TPUs, and even after investing so much time and so many resources there is no guarantee that the final model will have low generalization error on unseen data. To overcome this, we need to reduce the variance of the model. A successful approach to reducing variance is "ensemble learning". In this paper, we discuss different ensemble learning methods for improving the accuracy of deep learning models by reducing their variance.
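The simplest ensemble method in this family is prediction averaging: when each training run lands on slightly different weights, averaging the members' outputs cancels part of the run-to-run error. A toy sketch, in which linear "models" with fixed offsets stand in for independently trained networks (the offsets are invented for illustration):

```python
def make_model(bias):
    """Stand-in for one trained network: the true mapping y = 2x plus a
    model-specific offset representing training-run variance."""
    return lambda x: 2.0 * x + bias

def ensemble_predict(models, x):
    """Average the members' predictions; uncorrelated errors partly cancel."""
    return sum(m(x) for m in models) / len(models)

# Hypothetical per-model offsets standing in for stochastic training
biases = [-1.2, 0.4, 0.9, -0.5, 0.3]
models = [make_model(b) for b in biases]

true_y = 2.0 * 3.0
avg_single_error = sum(abs(m(3.0) - true_y) for m in models) / len(models)
ensemble_error = abs(ensemble_predict(models, 3.0) - true_y)
print(ensemble_error < avg_single_error)  # True: averaging reduces variance
```

Here the average individual error is 0.66 while the ensemble error is 0.02, because the offsets nearly cancel; with real networks the cancellation is partial but the same mechanism applies.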

A Review on Experimental Investigation of Surface Roughness & Material Removal Rate of EN-31 Alloy Steel
Authors:- M.Tech. Scholar Rahul Singh, Asst. Prof. Abhishek Singh Roha

Abstract- This paper investigates the influence of machining parameters on material removal rate (MRR) and surface roughness during CNC turning of EN-31 steel using tungsten carbide inserts. Three machining parameters were considered, and the Taguchi robust design technique with an L9 orthogonal array was used. The S/N ratio and ANOVA were used to find the mean response and the percentage contribution of each parameter. From the experimental results it is concluded that cutting speed has the most significant effect on surface roughness and MRR.
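The S/N ratios underlying such a Taguchi analysis are straightforward to compute: MRR is normally treated as a larger-the-better response and surface roughness as smaller-the-better. A sketch with hypothetical readings (not the paper's data):

```python
import math

def sn_larger_is_better(values):
    """Taguchi S/N ratio for a response to be maximised (e.g. MRR):
    -10·log10(mean(1/y^2))."""
    return -10.0 * math.log10(sum(1.0 / v ** 2 for v in values) / len(values))

def sn_smaller_is_better(values):
    """Taguchi S/N ratio for a response to be minimised (e.g. Ra):
    -10·log10(mean(y^2))."""
    return -10.0 * math.log10(sum(v ** 2 for v in values) / len(values))

# Hypothetical MRR readings (mm^3/min) for one L9 trial
print(round(sn_larger_is_better([120.0, 125.0, 118.0]), 2))  # about 41.65
# Hypothetical Ra readings (micron) for the same trial
print(round(sn_smaller_is_better([1.6, 1.7, 1.5]), 2))       # about -4.09
```

In a full L9 study these ratios are computed per trial and then averaged per factor level; the factor whose levels spread the mean S/N most (confirmed by ANOVA) is the most significant, which is how cutting speed would be identified.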

A Review on Errors Caused in Infrared Thermography Measurements
Authors:- M.Tech Scholar Neeraj Kumar Dubey, Prof. Nitin Jaiswal

Abstract- Infrared thermography uses a thermal imager to detect radiation and convert it into an object temperature and temperature distribution. Thermography measurements are affected by various parameters such as emissivity, ambient temperature, atmospheric temperature, transmittance, relative humidity, and the distance and view factor between object and sensor. Parameters such as emissivity, ambient temperature, transmittance, relative humidity and object-to-sensor distance are user-specified inputs to the measurement software. The present work reviews the errors caused in infrared thermography measurements.
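The role those user-specified parameters play can be seen in a common single-band radiometric model, here simplified to the Stefan-Boltzmann total-radiation form rather than a camera-specific calibration: the sensor signal mixes emitted object radiation, reflected ambient radiation and atmospheric emission, and the software inverts that mix. A sketch under those assumptions:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def object_temperature(t_apparent, emissivity, t_reflected,
                       transmittance=1.0, t_atmosphere=293.15):
    """Recover the object temperature (K) from the blackbody-equivalent
    apparent temperature, compensating for emissivity, reflected ambient
    radiation and atmospheric transmittance (simplified total-radiation
    model: W = eps*tau*s*To^4 + (1-eps)*tau*s*Tr^4 + (1-tau)*s*Ta^4)."""
    w_meas = SIGMA * t_apparent ** 4
    w_obj = (w_meas
             - (1 - emissivity) * transmittance * SIGMA * t_reflected ** 4
             - (1 - transmittance) * SIGMA * t_atmosphere ** 4)
    return (w_obj / (emissivity * transmittance * SIGMA)) ** 0.25

# With emissivity = 1 and a clear path, the correction is the identity
print(round(object_temperature(350.0, 1.0, 293.15), 2))  # 350.0
# A low-emissivity surface at the same apparent reading is actually hotter
print(object_temperature(350.0, 0.9, 293.15) > 350.0)    # True
```

This is why a wrong user-entered emissivity is one of the dominant error sources the review discusses: the correction scales the fourth power of temperature, so small emissivity errors produce large temperature errors.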

Implementation of Greenhouse Service Control Protocol using Raspberry-Pi
Authors:- Mohammed Ameen Uddin, Shanila Mahreen, Mohd Anas Ali

Abstract- The term "greenhouse" refers to a controlled atmosphere in which plants are cultivated. To achieve optimal plant development, greenhouse systems must continuously monitor and regulate environmental factors such as temperature, soil moisture, light intensity and humidity. A greenhouse provides a year-round climate for growing plants, even on cold, gloomy days. This project's major goal is to develop a basic, low-cost system that continually updates and controls the values of environmental parameters to ensure optimal plant development. Precision agriculture uses a variety of approaches to monitor and regulate the environment for growing numerous crops. Because rainwater is unevenly distributed, it is difficult to meet farmers' needs for even water management, which necessitates irrigation methods suited to every weather condition, soil type and crop variety. It is therefore vital to find a strategy that provides reliable sensing and regulation in order to build a proper growing atmosphere. Agriculture is one of the many areas where ICT technology is frequently used, yet the majority of equipment and greenhouses in the agriculture industry still rely on outdated serial connection methods. Several communication and information technologies, such as the Internet and Bluetooth, are becoming more widely used, yet they remain incompatible with one another. Korea is working on a set of standards to ensure that equipment from different vendors can interoperate; a standardized Link-Control protocol, independent of the underlying network infrastructure, can be used to offer this fundamental interoperability. In this article we designed a service-control protocol on the basis of the Link-Control protocol and implemented it in Python.
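The abstract only names the protocol layers, so the following is an invented illustration of what a minimal text-based service-control handler in Python might look like: the GET/SET command names, parameter names and framing are assumptions, not the paper's actual protocol.

```python
def parse_control_message(raw):
    """Parse one line of a hypothetical text-based service-control
    protocol, e.g. 'SET temperature=25.5' or 'GET humidity'."""
    parts = raw.strip().split(None, 1)
    command = parts[0].upper()
    if command == "GET":
        return {"cmd": "GET", "param": parts[1]}
    if command == "SET":
        key, value = parts[1].split("=", 1)
        return {"cmd": "SET", "param": key, "value": float(value)}
    raise ValueError("unknown command: " + command)

class GreenhouseService:
    """Minimal in-memory service target; a real deployment would read
    sensors and drive actuators through the Raspberry Pi's GPIO."""
    def __init__(self):
        self.params = {"temperature": 24.0, "humidity": 60.0}

    def handle(self, raw):
        msg = parse_control_message(raw)
        if msg["cmd"] == "SET":
            self.params[msg["param"]] = msg["value"]
            return "OK"
        return str(self.params[msg["param"]])

svc = GreenhouseService()
print(svc.handle("SET temperature=25.5"))  # OK
print(svc.handle("GET temperature"))       # 25.5
```

Keeping the parser separate from the transport is what makes a Link-Control-style design network-independent: the same `handle` loop can sit behind a serial line, TCP socket or Bluetooth channel.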

Survey of Dc-Dc Converters for Dc Nano-Grid with Solar PV Generation
Authors:- PG Scholar Poonam Singh, Asst. Prof. Abhijeet Patil, Associate Prof. Dr E.Vijay Kumar

Abstract- With the wide use of DC-characterized loads and more distributed energy resources (DERs), the DC nano-grid is becoming increasingly popular and is seen as an alternative to the AC grid system in the future. For safety, the DC nano-grid provides reliable grounding for residential loads, like a low-voltage AC power system. A nano-grid is a self-controlled entity that operates in either grid-connected or islanded mode and connects local distributed energy sources with a local distribution system. This paper reviews the performance of the DC-DC converters used in nano-grids. DC-DC converters maintain the voltage level of the system according to the load demand.
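Since the review centres on how DC-DC converters hold the nano-grid bus at the level the load demands, the ideal steady-state boost relation gives a feel for the control variable involved. The 24 V PV string and 48 V bus figures below are illustrative choices, not values from the paper:

```python
def boost_output_voltage(v_in, duty):
    """Ideal boost converter steady-state relation: V_out = V_in / (1 - D)."""
    return v_in / (1.0 - duty)

def duty_for_target(v_in, v_out):
    """Duty cycle D the controller must apply to reach a target bus
    voltage (ideal, lossless converter)."""
    return 1.0 - v_in / v_out

# A 24 V PV string stepped up to a hypothetical 48 V DC nano-grid bus
print(duty_for_target(24.0, 48.0))      # 0.5
print(boost_output_voltage(24.0, 0.5))  # 48.0
```

Real converters deviate from this ideal relation through conduction and switching losses, which is one reason comparative performance analysis of topologies matters for nano-grid design.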

Review Paper on Design of Vortex Tube Refrigeration
Authors:-Prof. E. L. Manjerekar, Faizan Girkar, Prakash More, Hanish Parab, Siddharth Parab

Abstract- The Ranque-Hilsch vortex tube has been used for many decades in various engineering applications. Because of its compact design and minimal maintenance requirements, it is very popular in heating and cooling processes. Despite its simple geometry, the mechanism that produces the temperature separation inside the tube is fairly complicated, and different investigators have explored a number of observations and theories concerning this phenomenon. This report reviews some of the major conclusions drawn from experimental and numerical studies since the vortex tube's invention. One of these studies showed that acoustic streaming caused by the vortex whistle plays a large part in the Ranque-Hilsch effect.

Performance Evaluation of a Self-Excited Induction Generator for Stand-Alone Wind Energy Conversion System
Authors:-Ruqaya Mohiudin, Priya Sharma

Abstract- This paper presents the performance characteristics of a self-excited induction generator (SEIG) under various operating conditions. It also explains the modeling of the parallel equivalent circuit used to evaluate the reactive power required by the SEIG. The variation of terminal voltage has been studied by varying the shaft speed, capacitance value and load. The simulations are carried out in the MATLAB/Simulink environment, and the simulation results are validated through an experimental set-up.

Deep Learning Approach for the Detection of Breast Cancer
Authors:-Research Scholar Sapna Bansal, Professor Dr. Rohit Kumar Singhal

Abstract- About 2.1 million women are diagnosed with breast cancer annually, making it the most frequent type of cancer among women. The aim is to raise the percentage of breast cancers detected early, enabling more effective treatment and a reduced risk of death from the disease. We use a number of machine learning approaches to assess whether a tumor's traits are benign or malignant. Doctors rely heavily on digital biomedical images, such as histopathological images, to diagnose cancer because of their accuracy; however, the analysis of histological images is time-consuming and almost always demands expert involvement. Computer-aided diagnostic (CAD) systems can assist clinicians in establishing more accurate diagnoses. Deep Neural Networks (DNNs) for biomedical image processing have recently proven to be at the forefront of the technology. In general, each image consists of a mix of structural and statistical information. The current work used a collection of biomedical breast cancer images and applied DNN approaches to categorize them on the basis of structural and numerical information extracted from the images. SVM, RF and CNN approaches are compared for classifying breast cancer images to find out which performs best. The accuracy achieved in this investigation was 98.00 percent.

Impacts of Bullying on Students
Authors:-Kuenga Dendup

Abstract- This research was carried out in one of the primary schools in Tsirang, involving 30 students, 15 boys and 15 girls, from Classes IV-VI. Participants were aged between 11 and 15 years, with a mean age of 13 years. Besides the quantitative data, the study uses qualitative data from a focus group discussion (FGD) attended by 15 students, seven girls and eight boys, aged 11 to 16 years. A total of 45 students contributed to this study. The study aimed to review, understand and analyze the literature on the bullying behaviours of school children; to find out the effect of bullying on students and gauge how it affects their interest in coming to school every day; to find out how bullying can sometimes lead to low self-esteem and academic underperformance; and to identify what educators can do to create a bully-free school.

Seismic Analysis of Multistorey Building with Floating Column
Authors:-M.Tech. Scholar Adnan Ahmed, Dr. P.K. Singhai

Abstract- Structural planning and design is the art and science of designing economical, elegant and durable structures. In the present scenario, buildings with floating columns are a typical feature of modern multistorey construction in urban India, yet such features are highly undesirable in buildings built in seismically active areas. The use of floating columns has increased tremendously because of the spacious and aesthetic appearance they allow, but this should not be achieved at the risk of building failure. This study highlights the importance of explicitly recognizing the presence of floating columns in the analysis of a building. The study analyzes a building with floating columns and compares it with a building without floating columns in terms of storey drift, base shear, time period and frequency using design software.

Seismic Retrofitting of Reinforced Concrete Structures
Authors:-M.Tech Scholar Md Aamir Sohail, Prof. Vijay Kumar Meshram

Abstract- Earthquakes are among the causes responsible for large-scale destruction of life and property around the world. To mitigate such hazards, it is important to adopt norms that enhance the seismic performance of structures. Earthquake loads must be carefully modeled to assess the real behavior of a structure, with the clear understanding that damage is expected but should be regulated. Seismic retrofitting is the modification of existing structures to make them more resistant to seismic activity, ground motion or soil failure due to earthquakes. In this project our aim is to analyze an existing building using STAAD Pro V8i, with and without the provision of seismic retrofitting. The structure is analyzed in STAAD Pro V8i, and the bending moment was chosen as the criterion for selecting the weak member. RC jacketing was selected as the retrofitting technique applied to the weak member, and the bending moment in the member was compared before and after retrofitting. It was determined that RC jacketing strengthened the structure, which had been vulnerable to seismic activity.

Analysis of Major Elements of Elevated Metro Bridge
Authors:-M.Tech Scholar Mohammad Ammar, Prof. Vijay Kumar Meshram

Abstract- An elevated metro system is the preferred type of metro system because of its ease of construction and because it makes urban areas more accessible without major construction difficulty. An elevated metro system has two major elements: the pier and the box girder. This research concentrates on the design of the pier and its performance. Conventionally, the pier of a metro bridge is designed using a force-based approach. During seismic loading, the behaviour of a single-pier elevated bridge relies mostly on its ductility and displacement capacity, so it is important to check the ductility of such single piers; force-based methods do not explicitly check the displacement capacity during design. Codes are now moving towards a performance-based (displacement-based) design approach, which considers the target performances at the design stage. In this work the pier is designed by both the force-based seismic design method and the direct displacement-based seismic design method, and the performance of the pier designed by Direct Displacement Based Design is assessed and compared with that of the force-based design.

Vibration and Buckling Analysis of Cracked Composite Beam
Authors:-M.Tech. Scholar Abuzar Khan, Dr. P.K. Singhai

Abstract- Cracks in structural members lead to local changes in their stiffness, and consequently their static and dynamic behaviour is altered. The influence of cracks on dynamic characteristics such as natural frequencies and modes of vibration has been the subject of many investigations; however, studies on the behaviour of cracked composite structures subjected to in-plane loads are scarce in the literature. The present work deals with the vibration and buckling analysis of a cantilever beam made from graphite-fibre-reinforced polyimide with a transverse one-edge non-propagating open crack, using the finite element method. The undamaged parts of the beam are modelled by beam finite elements with three nodes and three degrees of freedom at each node. An 'overall additional flexibility matrix' is added to the flexibility matrix of the corresponding non-cracked composite beam element to obtain the total flexibility matrix, and therefore the stiffness matrix, in line with previous studies. The vibration of the cracked composite beam is computed using the present formulation and compared with previous results. The effects of parameters such as crack location, crack depth, fibre volume fraction and fibre orientation on the natural frequencies of the beam are studied. It is found that the presence of a crack decreases the natural frequency, and the effect is more pronounced when the crack is near the fixed support and the crack depth is larger. The natural frequency of the cracked beam is found to be maximum at about 45% fibre volume fraction, and the frequency for any crack depth increases with the fibre angle. The static buckling load of a cracked composite beam decreases in the presence of a crack, and the decrease is more severe with increasing crack depth for any crack location.
Furthermore, the buckling load of the beam decreases with increasing fibre angle and is maximum at 0-degree orientation.

Seismic Risk Assessment of RCC Framed Structure with Vertically Irregular Buildings Shaped
Authors:-M.Tech. Scholar MD Arif Mansoori, Prof. Vijay Kumar Meshram

Abstract- Vertically irregular buildings are now attracting considerable interest in seismic research. Many structures are designed with vertical irregularity for architectural reasons. Vertical irregularity arises in buildings due to significant changes in stiffness and strength; an open ground storey (OGS) is an extreme case of vertical irregularity. The typical OGS and stepped types of irregularities are considered in the present study.

Experimental Investigation on Al-6061 for MRR and Surface Roughness Using MAFM Technique
Authors:-M. Tech Scholar Mohit, Assistant Professor Manoj

Abstract- MAFM (magnetic abrasive flow machining) is an innovative extension of AFM. By applying a magnetic field around the workpiece in AFM, both the material removal rate and the surface finish can be improved. MAFM is an established finishing technique capable of meeting varied finishing requirements across sectors of use such as aerospace, healthcare and automotive, and is commonly used to finish complex shapes to improved surface roughness values and tight tolerances. Its principal drawback, however, is a low finishing rate, and superior performance is achieved when the process is controlled online. Hence, the acoustic emission technique is applied to investigate the surface finish and material removal. Various modelling techniques have also been used to model the process and to correlate with experimental results, yet experts believe there is still scope for refinement in current MAFM research. In the present work, Al-6061 is drilled and bored by conventional machining, and the surface finish is then produced by means of abrasive flow machining. Experiments were conducted for input parameters such as abrasive concentration, abrasive mesh size and number of cycles.

Biometrics Authentication Systems
Authors:-Gita Roy

Abstract- Biometrics are body measurements and calculations related to human characteristics. Biometric authentication is used in computer science as a form of identification and access control; it is also used to identify individuals in groups that are under surveillance. Biometric identifiers are the distinctive, measurable characteristics used to label and describe individuals. They are often categorized as physiological characteristics, which relate to the shape of the body: examples include, but are not limited to, fingerprint, palm veins, face recognition, DNA, palm print, hand geometry, iris recognition, retina and odour/scent. Behavioural characteristics relate to a person's pattern of behaviour, including but not limited to typing rhythm, gait, keystroke dynamics, signature, behavioural profiling and voice. Some researchers have coined the term 'behaviometrics' to describe the latter class of biometrics.

Study on Torsional Behavior of RC T-Beams Strengthened with Glass FRP
Authors:-M.Tech Student Mohd Ahzam Imran, Asst. Prof. Vijay Kumar Meshram

Abstract- Environmental degradation, increased service loads, reduced capacity due to aging, degradation owing to poor construction materials and workmanship, and the conditional need for seismic retrofitting have created a demand for the repair and rehabilitation of existing structures. Fibre-reinforced polymers have been used successfully in many such applications for reasons such as low weight, high strength and durability. In the present work an experimental study was conducted to better understand the behavior of torsional strengthening of solid RC flanged T-beams. An RC T-beam is analyzed and designed for torsion like an RC rectangular beam; the contribution of the concrete flange is neglected by codes. In the present study the effect of the flange in resisting torsion is studied by changing the flange width of the control beams. The other parameters studied are strengthening configurations and fiber orientations. The aim of the present work is to determine quantitatively the effectiveness of GFRP used as external lateral reinforcement for flanged T-beams subjected to torsion. Experimental results obtained from GFRP-strengthened beams are compared with un-strengthened control beams. The study shows remarkable improvement in the torsional behavior of all the GFRP-strengthened T-beams. The experimentally obtained results are validated against the analytical model presented by A. Deifalla and A. Ghobarah and found to be in good agreement.

Well Productivity Optimization
Authors:-MBA. Jorge Vargas

Abstract- This work aims to determine, optimize, implement and follow up operational strategies, designs and engineering in order to obtain efficient, maximized well intervention programs and artificial lift systems, through the acquisition of information on fluids, bottom-hole pressures, pressure restoration factors and optimum well operating conditions.

Productivity-Collaboration and Integration of Functional Processes in Companies of the Oil and Gas Sector
Authors:-MBA.Jorge Vargas

Abstract- This work addresses productivity through the collaboration and integration of functional processes in companies in the oil and gas sector, developing and professionalizing "Collaboration Centers" to follow up and manage the factors, scenarios, and current and future operating conditions of the operation and its processes and workflows (exploration, exploitation design, reservoirs, drilling, completion, production, workover, surface facilities, construction, logistics, transport, maintenance, safety, occupational health and the environment).

Application of Double Ribbed Twisted Tapes in Heat Transfer Enhancement of Tubular Heat Exchanger
Authors:-M. Tech. Scholar Umesh Kumar Yadav, Asst. Prof. Saumitra Kumar Sharma

Abstract- Nowadays, heat exchangers with twisted-tape inserts are widely applied to enhance convective heat transfer in industries such as thermal power plants, chemical processing plants, air conditioning equipment, refrigeration, petrochemicals, biomedicine and food processing. In general, a twisted-tape insert introduces swirl into the bulk flow, which disrupts the thermal boundary layer on the tube surface. Recently, twisted tapes with cuts and holes have become popular because of their improved thermal performance compared with other twisted-tape types, and several studies have been carried out on these modified tapes. This work presents a numerical model for heat transfer intensification in a heat exchanger tube equipped with a novel V-cut twisted tape. The effects of different cut ratios (0.6 < b/c < 1.25) on the turbulent flow characteristics and thermal performance of the system are investigated over the Reynolds number range 4000 to 12000. All simulations are performed for fully developed turbulent flow with a uniform heat flux of 5000 W/m². The numerical results for heat transfer (Nusselt number, Nu), pressure drop (friction factor, f) and the enhancement performance factor for a tube with V-cut twisted tapes are reported.
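The enhancement performance factor reported in such studies weighs the heat-transfer gain against the pressure-drop penalty relative to a smooth tube. A sketch using the standard Dittus-Boelter and Blasius correlations for the plain-tube baseline; the V-cut values Nu = 95 and f = 0.06 below are made-up placeholders, not results from the paper:

```python
def nusselt_plain_tube(re, pr):
    """Dittus-Boelter correlation for a smooth tube under heating:
    Nu0 = 0.023 Re^0.8 Pr^0.4."""
    return 0.023 * re ** 0.8 * pr ** 0.4

def friction_plain_tube(re):
    """Blasius friction factor for turbulent flow in a smooth tube:
    f0 = 0.316 Re^-0.25."""
    return 0.316 * re ** -0.25

def performance_factor(nu, f, re, pr):
    """Thermal enhancement factor eta = (Nu/Nu0) / (f/f0)^(1/3): the
    heat-transfer gain of the insert weighed against its pressure-drop
    penalty at constant pumping power."""
    nu0 = nusselt_plain_tube(re, pr)
    f0 = friction_plain_tube(re)
    return (nu / nu0) / (f / f0) ** (1.0 / 3.0)

# Hypothetical V-cut twisted-tape results at Re = 8000, Pr = 5 (water)
eta = performance_factor(nu=95.0, f=0.06, re=8000.0, pr=5.0)
print(round(eta, 2))
```

A factor above 1 means the insert transfers more heat than the extra pumping power it costs, which is the criterion used to rank cut ratios across the Reynolds number range.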

Study and Optimization of Defects in Casting Used in Foundry with the Use of Six Sigma Methodology
Authors:-M. Tech. Scholar Shubham Verma, Asst. Prof. Vivek Singh, Prof. Rajesh Rathore, Asst. Prof. Virendra Dashore

Abstract- Casting industries play an important part in manufacturing: goods of complex form and size are created in a single process, which other manufacturing methods cannot match because they require more than one step to transform a raw material into a finished product. Casting quality should be maintained without flaws throughout production. Since a 100% accuracy rate is not feasible, quality control tools and methods are used to reduce the proportion of defects. The primary goal of this study is to minimize the shrinkage defect occurring in the external bearing ring of ductile cast iron manufactured at the Renuka casting factory in Pithampur, Indore. Production and defect data covering several months were collected from the industry, and the flaws were analysed using the Six Sigma DMAIC (Define, Measure, Analyse, Improve, Control) method, with quality control tools applied at the various phases of DMAIC to detect and control problems. In addition, the Taguchi method is used to generate an L9 orthogonal array in the Minitab programme. Finally, the optimum solution is developed and recommended to the industry for defect reduction.

Blockchain in the KYC Process – A Case Study
Authors:-Abhishek Oberoi, Bhargav Patel, Anas Mansuri

Abstract- This paper examines the suitability of blockchain technology for improving existing KYC procedures, which are often described as lengthy, costly and cumbersome. Moreover, similar identification processes must be carried out repeatedly for several institutions, which creates considerable inefficiencies and avoidable costs. The use of a blockchain design with smart contracts offers the possibility of avoiding redundant workflows and entails several benefits such as enhanced security, trust and flexibility. This illustrates that blockchain technology, which is still maturing, has the potential to play an important role in streamlining and, to some extent, automating current KYC processes. In terms of security, trustworthiness and customer satisfaction, the technology may offer game-changing opportunities (not only) in the realm of authenticated user identification and digital identity management.

A Review on Heavy Metal Pollution of Holy River Ganga
Authors:-Dr. Pushpraj Singh

Abstract- The Ganga, one of the most sacred and worshipped rivers of India, is regarded as the cradle of Indian civilization. The Ganga is a source of life, but contamination of its water is a major threat in today's India. Industrial, municipal and agricultural wastes contain large amounts of organic and inorganic materials and lead to water pollution, including variable amounts of heavy metals, some of which are potentially toxic and may affect human health and the health of the aquatic system. Many natural and anthropogenic sources contribute heavy metal pollution to the water. The concentrations of heavy metals determined were above the maximum admissible and desirable limits set by national and international organizations such as the CPCB (Central Pollution Control Board), ISI (Indian Standards Institution), ICMR (Indian Council of Medical Research), WHO (World Health Organization) and USEPA (United States Environmental Protection Agency). Exposure to heavy metals has been linked to chronic and acute toxicity, developmental retardation, neurotoxicity, kidney damage, various cancers, liver damage, lung damage, fragile bones, and even death in instances of very high exposure. The major objective of this review is to present the findings of the work carried out by many scientists, environmentalists and researchers on the heavy metal pollution of the holy river Ganga.

Crop Infection Detection Using YOLO
Authors:-Satwik Ram Kodandaram, Kushal Honnappa, Parikshith H, Sandesh, Kushal C

Abstract-Agriculture is the backbone of a country; without agriculture, there is no economic growth. As technology improves day by day, it can be applied to farming and agriculture so that crops are utilized to the maximum and wastage is minimized. To achieve this, a few challenges must be addressed: which crops can be grown under certain weather conditions, and how disease in crops can be identified so that it can be prevented and yield maximized. As the famous quote says, prevention is better than cure. Artificial Intelligence is one of the greatest inventions; using AI we can train a machine with images to detect disease in crops, addressing the problem of the under-utilization of crops. This paper proposes a model for crop infection detection and crop yield maximization using Convolutional Neural Networks (CNN) and You Only Look Once (YOLO).
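Detectors in the YOLO family score overlapping candidate boxes by Intersection-over-Union (IoU) when suppressing duplicates and matching predictions to ground truth. A minimal, self-contained version of that computation (an illustrative sketch, not the paper's implementation) is:

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle: clamp to zero width/height when boxes are disjoint.
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    return inter / union if union else 0.0
```

Two identical boxes give 1.0, disjoint boxes give 0.0, and partial overlaps fall in between, which is what a YOLO-style detector thresholds against.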

Detection of Sickle Cell Anemia from Blood Smear Images Using CNN Algorithm in Image Processing
Authors:-Lecturer Dinesh Kumar S.

Abstract-Human blood consists of three major cell types: red blood cells, white blood cells and blood platelets. Sickle cell disease is a group of disorders that affects hemoglobin, the molecule in red blood cells that delivers oxygen to cells throughout the body. In sickle cell anemia, the blood contains abnormal hemoglobin molecules referred to as hemoglobin S, which distort red blood cells into a sickle, or crescent, shape. Sickle cell anemia is a hereditary form of anemia in which mutated hemoglobin deforms the red blood cell into sickle-shaped cells under low oxygen levels. Signs and symptoms of sickle cell disease typically begin in infancy. Detection of sickle cell anemia emphasizes the analysis required for accurate disease diagnosis; here it is performed using a CNN algorithm in image processing. To segment the images, techniques such as plane extraction, arithmetic operations, linear contrast stretching, histogram equalization and global thresholding are employed, with the Gray Level Co-occurrence Matrix used for classification.
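The Gray Level Co-occurrence Matrix mentioned above counts how often each pair of gray levels occurs at neighbouring pixels; texture features such as contrast are then derived from it. A toy version using a horizontal offset of (0, 1) and the standard contrast feature (illustrative only, not the paper's pipeline) can be sketched as:

```python
def glcm(image, levels):
    """Gray-Level Co-occurrence Matrix for horizontal neighbours, offset (0, 1).
    image is a list of rows of integer gray levels in range(levels)."""
    m = [[0] * levels for _ in range(levels)]
    for row in image:
        for a, b in zip(row, row[1:]):  # each left/right pixel pair
            m[a][b] += 1
    return m


def contrast(m):
    """GLCM contrast: (i - j)^2 weighted by the normalised co-occurrence counts."""
    total = sum(sum(r) for r in m)
    return sum((i - j) ** 2 * v / total
               for i, row in enumerate(m) for j, v in enumerate(row))
```

A uniform region yields contrast 0, while a checkerboard-like region yields high contrast, which is why the feature helps separate normal from sickled cell textures.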

An Efficient Lidar Sensing System for Self-Driving Cars
Authors:-Syed Ghouse Mohiuddin, Mr. Dargah Akbar Hussain, Mohd Anas Ali

Abstract-A self-driving car is an autonomous robot that navigates to its destination without a human operator. The aim of this project is to build an efficient LIDAR sensing system for self-driving cars that is capable of mapping its surroundings, navigating along a path, and reaching the destination automatically. Through scan matching, the robot detects a previously visited location and creates one or more loop closures along its path. To plan a path through an environment effectively, a probabilistic roadmap (PRM) identifies an obstacle-free path from a start to an end point; the PRM method employs a network of connected nodes, and the obstacle locations given in the map are used when linking the nodes. The findings are demonstrated on a low-cost autonomous RC robot running ROS Kinetic on a Raspberry Pi, with a YDLIDAR X2 mounted at the front top. This low-cost autonomous bot is equipped with capabilities such as Simultaneous Localization and Mapping, and Path Planning and Following, allowing it to reach its destination autonomously once it is marked on the map.
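The PRM idea described above (sample collision-free nodes, link nearby nodes whose connecting segments avoid obstacles, then search the roadmap) can be sketched in a few dozen lines. This is a simplified stand-in, not the project's ROS implementation; the 10x10 workspace, sample count, and connection radius are assumptions.

```python
import heapq
import math
import random


def prm_path(start, goal, obstacles, n_samples=80, radius=4.0, seed=1):
    """Minimal probabilistic roadmap on a 10x10 workspace.
    obstacles: axis-aligned boxes (x1, y1, x2, y2) that nodes and edges
    must avoid. Returns a node sequence from start to goal, or None."""
    rng = random.Random(seed)

    def free(p):
        return not any(x1 <= p[0] <= x2 and y1 <= p[1] <= y2
                       for x1, y1, x2, y2 in obstacles)

    def segment_free(a, b, steps=20):
        # Sample points along the straight segment and reject any collision.
        return all(free((a[0] + (b[0] - a[0]) * t / steps,
                         a[1] + (b[1] - a[1]) * t / steps))
                   for t in range(steps + 1))

    # 1. Sample collision-free nodes; start is index 0, goal is index 1.
    nodes = [start, goal]
    while len(nodes) < n_samples:
        p = (rng.uniform(0, 10), rng.uniform(0, 10))
        if free(p):
            nodes.append(p)

    # 2. Link nearby nodes whose connecting segment is obstacle-free.
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    adj = {i: [] for i in range(len(nodes))}
    for i in range(len(nodes)):
        for j in range(i + 1, len(nodes)):
            if dist(nodes[i], nodes[j]) <= radius and segment_free(nodes[i], nodes[j]):
                adj[i].append(j)
                adj[j].append(i)

    # 3. Dijkstra search over the roadmap from start (0) to goal (1).
    best, prev, heap = {0: 0.0}, {}, [(0.0, 0)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == 1:
            path = [1]
            while path[-1] != 0:
                path.append(prev[path[-1]])
            return [nodes[i] for i in reversed(path)]
        if d > best.get(u, math.inf):
            continue
        for v in adj[u]:
            nd = d + dist(nodes[u], nodes[v])
            if nd < best.get(v, math.inf):
                best[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return None
```

Being probabilistic, the roadmap is only complete in the limit of many samples; real deployments re-sample or densify when no path is found.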

Autonomous Energy-Efficient Wireless Sensor Network Platform for Home/Office Automation
Authors:-Shaik Mohammed Shahed, Mohd Abdul Sattar, Mohd Anas Ali

Abstract-Smart homes and workplaces can help people live and work more comfortably with wireless sensor networks (WSNs). These applications use sensors, a microcontroller, a radio, and an antenna to regularly collect data from a dispersed network of low-power, low-cost, highly energy-efficient electronic platforms and deliver it to a distant host station for pre-processing and transmission. To address future Internet of Things (IoT) application requirements, an integrated photovoltaic panel with a rechargeable battery and a power-efficient architecture is provided, since the large numbers of interconnected wireless nodes being designed and implemented need to be energetically self-sufficient.

Bio-Geography Based Page Prediction Using Web Mining Feature
Authors:-Trivene Khede, Dr. Avinash Sharma

Abstract-A website is a good place to reach an audience in any field, and many companies use this platform for different businesses. Retaining a visitor on a website depends on the available content and the intelligence of the site. This paper develops an intelligent model that can predict the next web page by understanding the behavior of the user. A biogeography-based optimization genetic algorithm was used to predict the web page from past user visits. This work uses the web content and web log features of the website to evaluate the fitness value of the genetic algorithm's chromosomes. Experiments were performed on real datasets of different sizes. Results show that the proposed model improves the values of the different evaluation parameters.

Optimization of Hybrid Renewable Energy Systems (HRES) Using PSO
Authors:-M. Tech. Scholar Anit Kumar Vaishya, Prof. Vinay Pathak

Abstract-The present paper discusses the scope and limitations of photovoltaic solar water pumping systems. The components and functioning of a PV solar pumping system are described, and the research works of previous noteworthy researchers are reviewed. Irrigation is a well-established practice on many farms and is carried out at various levels around the world; it allows diversification of crops while increasing crop yields. However, typical irrigation systems consume a great amount of conventional energy through electric motors and fuel-powered generators. Photovoltaic energy can find many applications in agriculture, providing electrical energy in various cases, particularly in areas without an electric grid. This work proposes a single-stage grid-interactive solar-powered switched reluctance motor (SRM) driven water pumping system with an efficient control technique. The control of the proposed system provides proficient maximum power point tracking (MPPT) and motor drive control with bidirectional power flow between the photovoltaic (PV) array and the single-phase grid. It offers harmonic elimination, improved dynamic performance and a DC offset rejection capability compared to other controls. A PV feedforward term is also incorporated in the developed control to enhance the dynamic performance of the system and to minimize the size of the DC link capacitor with improved MPPT performance. The novel scheme of fundamental switching of the SRM drive over its maximum operational time (when the grid is present) makes the system efficient and reliable. An improved perturb and observe (P&O) based MPPT algorithm is used to minimize undesirable losses in the PV array, especially under varying insolation levels. The proposed control is tested on a developed prototype, and its suitability is validated through simulation and test results under various conditions.
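The perturb and observe (P&O) rule referenced above is simple enough to sketch: perturb the operating voltage, keep moving in the same direction while power rises, and reverse when it falls. The power curve, starting voltage, and step size below are assumptions for illustration, not the paper's plant model.

```python
def p_and_o(pv_power, v=20.0, step=0.5, iters=100):
    """Perturb-and-observe MPPT sketch.
    pv_power(v) is a callable modelling the PV power-voltage curve."""
    direction = 1.0
    power = pv_power(v)
    for _ in range(iters):
        v += direction * step
        new_power = pv_power(v)
        if new_power < power:          # power dropped: perturb the other way
            direction = -direction
        power = new_power
    return v
```

On a single-peaked curve the tracker converges to the maximum power point and then oscillates within one step of it, which is why the abstract stresses improvements to plain P&O under varying insolation.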

Home Automation Based on IoT
Authors:-Ankita Jaiswal, Mr. Shailendra Singh Bhalla

Abstract- Home automation helps maintain comfortable living conditions within a home. One can achieve home automation by simply connecting home electrical appliances to the internet or cloud storage. Demand for network-enabled home automation has surged in recent days because of its simplicity and comparative affordability. Platforms based on the Internet of Things help connect the things surrounding everyone, so that anything and everything can be accessed at any time and place in a user-friendly manner using custom-defined portals. The most significant comfort factors are thermal comfort, which is related to temperature and humidity, followed by visual comfort and air quality. The proposed design uses such a platform for collecting and visualizing monitored data and for remote control of home appliances and devices. The selected platform is very flexible and user-friendly.

Security Issues in Internet of Things
Authors:-Hardika Juneja

Abstract- The Internet of Things (IoT) is used everywhere today, be it at home, in the office or across a company at large. Our lives increasingly depend on this emerging technology, and we are developing and progressing thanks to the great advancements in this field. Scientists expected 2015 to be marked in history as an important year for the development of IoT, but the growing security issues in IoT paused this advancement, even as the media prepared to expose the real picture of IoT security to the public. IoT security is a big issue today, but at the same time it should not stop anyone from building IoT applications; although testing and security play an important role, feasible solutions can be sought rather than holding people back from launching new IoT applications. Security-related problems in IoT need to be solved: we must identify precisely what each problem is and then apply the most effective solution. Here an attempt is made to find such problems and identify their particular solutions. The security issues discussed include encrypting data, authentication of information, side-channel attacks, hardware issues, public perception and vulnerability to hacking.

A Comparative Study on Maximum Power Point Tracking Techniques for Utility Grid Connected Photovoltaic Systems using ANFIS and INC Method
Authors:-M. Tech. Scholar Mr. Aravind Khote, Asst. Prof. Ms. Shalini Goad

Abstract- It is a well-known fact that dependency on non-renewable sources needs to be reduced to deal with global warming. Solar energy is one such option, available in abundance in India. Solar PV cells are utilised to trap this energy and convert it into electrical energy; a PV cell can convert roughly 20% of the incident solar energy into electricity. The output of a PV cell depends on solar irradiation, panel temperature and panel terminal voltage, based on which the maximum power point (MPP) can be attained, and the aim of this work is to achieve operation at that point through MPPT. This work presents a comparative study in MATLAB/Simulink between two maximum power point tracking (MPPT) methods: the incremental conductance (INC) method and an ANFIS-based method. The study is performed with variable irradiation and temperature. In simulation, the boost-converted output voltage is found to be 502.13 V for the ANFIS MPPT method and 501.50 V for the INC MPPT method. In addition, the output power of the boost converter under variable irradiation is found to be 92.26 kW for ANFIS and 90.41 kW for INC. From the comparison results, ANFIS has a clear upper hand over the INC method in terms of performance.
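The incremental conductance (INC) method compares the incremental conductance dI/dV with -I/V: the two are equal exactly at the maximum power point, since dP/dV = d(VI)/dV = I + V·dI/dV = 0 there. A minimal decision step (an illustrative sketch with an assumed voltage step, not the Simulink model used in the study) might look like:

```python
def inc_step(v, i, dv, di, v_step=0.5):
    """One incremental-conductance MPPT decision: compare dI/dV with -I/V
    and move the operating voltage toward the maximum power point."""
    if dv == 0:
        # Voltage unchanged: use the current change alone to pick a direction.
        return v + (v_step if di > 0 else -v_step if di < 0 else 0.0)
    g = di / dv          # incremental conductance dI/dV
    target = -i / v      # negative instantaneous conductance -I/V
    if abs(g - target) < 1e-6:
        return v         # dP/dV = 0: already at the MPP
    return v + v_step if g > target else v - v_step
```

Unlike P&O, this rule can recognise the MPP and stop perturbing, rather than oscillating around it.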

Brain Tumor Detection and Segmentation Using Novel Approach of Soft Computing
Authors:-Research Scholar Asif Manzoor Qadri, Asst. Prof. Shaveta Bala

Abstract- These days one of the major concerns for human life is cancer, and the number of cancer patients is increasing day by day, for many reasons. Brain tumors are of two kinds: benign and malignant. A benign tumor increases in size very slowly and does not spread to neighboring tissues, while a malignant tumor grows very fast and may spread to other nearby organs. Different methods are used for the treatment of brain tumors, such as radiotherapy and chemotherapy, and treatment depends on accurate detection as well as the tumor's type, location and size, the patient's age, and the experience of the physician. In the present work an intelligent system is designed using soft computing techniques to automatically detect a brain tumor in the human brain. The proposed technique filters the input image and then segments it; after this, different features are extracted to determine whether a tumor is present in the image. The proposed technique is compared with other well-known techniques to establish the worth of the proposed brain tumor detection approach.

Cloud Computing in Banking Sector – A Case Study
Authors:-Abhishek Oberoi, Yash Dave, Bhargav Patel, Mohammed Anas

Abstract- The advent of cloud computing has changed the way organizations meet their IT requirements. Cloud computing has emerged as a new era in IT and is high on the agenda of all CIOs. Many banks now use cloud technology to achieve their various goals. Cloud technology provides business models that deliver new customer experiences, efficient collaboration, improved speed to market and improved IT efficiency. Using cloud computing, banks can create a flexible and fast banking environment that can respond quickly to new business needs. This article provides useful insight into how cloud computing can be used in the banking industry, the various business models associated with it, and the challenges the banking industry faces in adopting this technology.

Review Article on Roadway Pavement Design and Soil Penetration Analysis Using FEM
Authors:-M. Tech. Scholar Ritu Bhalavi, Asst. Prof. Mohit Verma

Abstract- Because a substantial volume of commercial vehicles is likely to use the facility, the pavement structure has to receive careful consideration in the design and choice of materials forming the pavement. Pavement costs constitute a significant proportion of the total cost of a highway facility; hence, great care is needed in selecting the right type of pavement and the specifications for the various courses that make it up. The choice of pavement type, whether flexible or cement concrete, therefore has to be exercised very carefully. Pavement-related traffic safety factors include skid resistance, drainability against hydroplaning, and night visibility. Cement concrete pavement has a distinct initial advantage over bituminous pavement in this regard, as surface texturing forms an integral part of normal construction practice for such pavements; it also has superior night visibility by virtue of its lighter colour. Properly designed and constructed concrete pavements are known to have a very long service life. The cement concrete roads constructed in the country in the past, though extremely limited in length, have an excellent service record, having given good service under conditions much more severe than those for which they were originally intended.

Review Paper on Solar Seawater Desalination by Using Reverse Osmosis
Authors:-Prof. K. S. Kamble, Shivam Pawar, Shubham Sawant, Siddhant Narkar, Prathamesh Rane

Abstract- Desalination plants provide a very effective solution to meet the demand for drinking water from saline water. This work focuses on the design and modelling of a portable solar-based Reverse Osmosis (RO) desalination plant. The proposed plant is run by a stand-alone solar system with battery storage. The total energy requirement of the plant is estimated in order to size the solar panel, the charge controller, the power supply, and the storage system. Purification of saline water using solar-powered desalination is an efficient answer to water scarcity on ships and represents a promising sustainable direction for desalination plants.

Cloud-Agnostic Solutions for Multi-Biometric Systems: A Java-Based Approach
Authors:-Dr. Vinayak Ashok Bharadi

Abstract- This paper presents a cloud-agnostic architecture for managing multi-biometric systems, focusing on scalability, modularity, and interoperability. Building on Manchana’s 2020 research, the framework leverages Java-based design patterns, including Singleton and Factory, to enable seamless deployment across multiple cloud platforms. The proposed solution facilitates dynamic workload distribution and device management, addressing the challenges of real-time biometric processing in resource-constrained environments. The results demonstrate enhanced scalability and adaptability, with the framework supporting up to 100,000 biometric records and ensuring efficient system performance under high loads.

DOI: 10.61137/ijsret.vol.7.issue5.712

The Concept of ZFS for Long-Term Biomedical Imaging Data Storage

Authors: Chathurika Ranasinghe, Dineth Weerakoon, Malsha Bandara, Thivanka Gunawardana

Abstract: Biomedical imaging systems generate large volumes of sensitive data that must be securely stored, reliably retrieved, and retained for long durations to meet regulatory, clinical, and research demands. ZFS, a high-integrity, copy-on-write file system with integrated volume management, has emerged as a viable solution for long-term imaging storage in healthcare and biomedical research institutions. This review explores the suitability of ZFS for managing medical imaging archives, highlighting its built-in features such as end-to-end checksumming, atomic snapshots, native encryption, and tiered storage capabilities. The paper examines ZFS's alignment with regulatory requirements like HIPAA, GDPR, and FDA 21 CFR Part 11, and discusses how its auditability, snapshot lifecycle management, and disaster recovery features help ensure compliance and data integrity. We delve into ZFS performance tuning for imaging workloads, including optimizations using ARC, L2ARC, SLOG, and record size configuration, which are critical for high-throughput radiology and pathology systems. Integration with PACS, RIS, and AI processing pipelines is reviewed, along with real-world deployments in clinical and research environments. Operational challenges such as resource overhead, secure deletion limitations, and administrative complexity are addressed, alongside emerging trends like object storage extensions, support for storage-class memory, and container-native workflows. Through this comprehensive review, ZFS is positioned not only as a technically robust and scalable imaging storage platform, but also as a strategic foundation for future-proof, compliant biomedical data management.

DOI: https://doi.org/10.5281/zenodo.15847617

Evaluating The Impact Of Remote Product Teams On Software Delivery Timelines: A Case Study Of U.S. SaaS Companies Post-2020

Authors: Omon ENI, Arun K Menon

Abstract: The COVID-19 pandemic fundamentally transformed the operational landscape of U.S. Software-as-a-Service (SaaS) companies, forcing rapid adoption of remote-first product management practices. This study examines the impact of distributed product teams on software delivery timelines through a comprehensive analysis of 127 U.S.-based SaaS companies that transitioned to remote operations between March 2020 and December 2021. Using mixed-methods research combining quantitative performance metrics and qualitative interviews with product managers, this investigation reveals significant variations in delivery performance based on organizational adaptation strategies, communication frameworks, and asynchronous workflow implementations. Key findings indicate that companies implementing structured asynchronous decision-making processes experienced 23% faster feature delivery times, while organizations lacking formal remote collaboration frameworks saw 31% longer development cycles. These results contribute to the growing body of literature on distributed software development and provide actionable insights for product management practitioners navigating the post-pandemic digital workplace.

DOI: http://doi.org/10.5281/zenodo.17044243

Optimizing Hybrid Unix CRM Infrastructure Using Salesforce Flows, Omni-Channel Automation, And AI-Driven Service Intelligence

Authors: Gurpal Mann

Abstract: Hybrid Customer Relationship Management (CRM) infrastructures are increasingly critical in enterprises that balance cloud agility with on-premise reliability. This review examines the role of Salesforce Flows, Omni-Channel automation, and AI-driven service intelligence in optimizing CRM operations within hybrid Unix/Linux environments. It highlights how Salesforce Flows streamline cross-platform workflows, how Omni-Channel automation enables unified and consistent customer engagement, and how AI enhances decision-making through predictive analytics and autonomous orchestration. Integration frameworks, performance optimization strategies, and real-world industry applications in finance, healthcare, retail, and telecommunications are explored in depth. A comparative analysis of Salesforce against other CRM platforms such as Microsoft Dynamics 365, Oracle CX Cloud, and SAP Customer Experience underscores Salesforce’s flexibility and forward-looking AI capabilities. The review also discusses future trends, including self-healing systems, zero-trust security, and generative AI, which will further shape the evolution of hybrid CRM environments. Ultimately, the study demonstrates that enterprises leveraging Salesforce’s automation and AI capabilities alongside Unix/Linux reliability can achieve secure, scalable, and customer-centric CRM infrastructures.

DOI: https://doi.org/10.5281/zenodo.17368364

 

AI-Powered CTI And Salesforce Omni-Channel Integrated With Hybrid Unix Systems For Seamless Enterprise Communication Flows

Authors: Balvinder Deol

Abstract: In today’s enterprise landscape, seamless and intelligent communication flows are critical for delivering superior customer experiences. This review examines the integration of AI-powered Computer Telephony Integration (CTI) and Salesforce Omni-Channel with hybrid Unix/Linux infrastructures to achieve secure, scalable, and context-aware customer engagement. It highlights how CTI has evolved from basic telephony management to AI-driven workflows incorporating speech recognition, sentiment analysis, and predictive routing. Salesforce Omni-Channel is explored as a unified engagement hub that orchestrates voice and digital interactions across multiple channels, ensuring consistency and efficiency. The role of Unix/Linux systems as reliable, secure backends supporting telephony services and middleware integration is emphasized, particularly in hybrid architectures. The article discusses middleware and API-driven frameworks as enablers for interoperability, while addressing performance optimization strategies such as load balancing, elastic scaling, and AI-driven orchestration. Industry applications in finance, healthcare, retail, and telecommunications are examined, illustrating real-world benefits of these integrations. Comparative analysis with other CRM platforms—Microsoft Dynamics 365, Oracle CX Cloud, and SAP Customer Experience—underscores Salesforce’s strengths in flexibility, AI capabilities, and hybrid adaptability. Future research directions include the adoption of generative AI, autonomous self-healing communication systems, edge computing for real-time optimization, and security-first communication models. By combining Salesforce’s cloud-native intelligence with Unix/Linux reliability, enterprises can deliver customer-centric communication flows that are resilient, secure, and adaptive to evolving business needs.

DOI: https://doi.org/10.5281/zenodo.17368515

 

Implementing Apache Tomcat And JBoss Middleware For Salesforce AI Agents Across Hybrid Multi-Cloud Enterprise Environments

Authors: Tejinder Sandhu

Abstract: The integration of Salesforce AI agents across hybrid multi-cloud environments is redefining the enterprise Customer Relationship Management (CRM) landscape. Middleware solutions, particularly Apache Tomcat and JBoss, play a critical role in enabling seamless interoperability between Salesforce’s AI-driven services and diverse enterprise systems hosted on Unix/Linux and cloud infrastructures. This review explores how Tomcat’s lightweight architecture and JBoss’s enterprise-grade features collectively support API management, workflow orchestration, transaction integrity, and scalability. It also examines performance optimization strategies, industry-specific applications, and comparative insights with alternative middleware platforms such as MuleSoft, WebSphere, and Apache Kafka. Furthermore, the study highlights future directions, including AI-driven orchestration, edge computing integration, generative AI for middleware automation, and security-first architectural models. By providing a comprehensive analysis, this review underscores how middleware technologies are foundational for deploying Salesforce AI agents in complex enterprise ecosystems, ultimately enabling organizations to achieve resilience, compliance, and customer-centric innovation in the digital age.

DOI: https://doi.org/10.5281/zenodo.17368656

 

Unlocking Synergies Between AI-Powered Salesforce CRM Engineering And Traditional Unix/Linux Hybrid Infrastructure For Enterprise Growth

Authors: Gopal Sehrawat

Abstract: The rapid evolution of enterprise IT demands solutions that combine innovation with stability, intelligence with security, and customer engagement with operational efficiency. This review explores the convergence of AI-powered Salesforce Customer Relationship Management (CRM) platforms with traditional Unix/Linux hybrid infrastructures, highlighting how enterprises can unlock synergies to drive sustainable growth. Salesforce CRM, augmented by artificial intelligence, provides predictive analytics, intelligent automation, and personalized customer experiences. Unix/Linux, long valued for its reliability, scalability, and compliance-ready frameworks, continues to power mission-critical systems across industries such as finance, healthcare, retail, and manufacturing. The integration of these two domains creates a hybrid ecosystem where Salesforce delivers intelligent front-end capabilities while Unix/Linux ensures robust back-end processing and governance. The article examines technical challenges including legacy compatibility, data synchronization, and regulatory compliance, before presenting strategic frameworks such as architectural blueprints, governance models, automation-driven orchestration, and cloud–on-premises balance. Case studies illustrate how different industries leverage this synergy for measurable business value. Future trends—edge computing, quantum-safe cryptography, AI-driven automation, and containerized microservices—are identified as critical enablers for next-generation hybrid ecosystems. By aligning AI-powered Salesforce CRM with Unix/Linux infrastructures, enterprises can enhance customer engagement, optimize operations, and maintain compliance while future-proofing their digital strategies.

DOI: https://doi.org/10.5281/zenodo.17519957

 

Leveraging Red Hat Satellite And Salesforce Einstein Copilot For Secure, Scalable Hybrid Cloud CRM Automation Environments

Authors: Anjali Kathuria

Abstract: The convergence of Red Hat Satellite and Salesforce Einstein Copilot offers enterprises a transformative approach to hybrid cloud CRM environments, combining robust infrastructure management with AI-driven customer engagement. Red Hat Satellite provides centralized provisioning, configuration, patching, and lifecycle management for Linux-based servers, ensuring security, compliance, and operational resilience across on-premises and cloud platforms. Salesforce Einstein Copilot delivers predictive analytics, workflow automation, and personalized CRM insights, enabling proactive and intelligent customer engagement. This review explores architectural synergies, automation frameworks, security considerations, and performance optimization strategies necessary for integrating these technologies within hybrid cloud ecosystems. Real-world applications across finance, healthcare, retail, and manufacturing illustrate measurable improvements in operational efficiency, regulatory compliance, and customer satisfaction. Challenges such as legacy system integration, data synchronization, multi-cloud security risks, and AI workload management are analyzed alongside strategic frameworks for seamless integration, governance, and orchestration. The findings highlight that hybrid CRM environments leveraging Red Hat Satellite and Salesforce Copilot can achieve scalable, secure, and automated operations while maintaining high availability and cost-efficiency. Emerging trends in AI, edge computing, and self-healing infrastructure are expected to further enhance these ecosystems, providing enterprises with a blueprint for sustainable digital transformation, innovation, and growth.

DOI: https://doi.org/10.5281/zenodo.17520055

 

AIX, Solaris, And Modern Linux: Building Future-Ready Infrastructure For Salesforce LWC And AI-Enhanced Cloud Experiences

Authors: Rajat Bhardwaj

Abstract: Enterprises are increasingly adopting hybrid IT architectures that combine legacy UNIX/Linux systems with cloud-based CRM platforms and AI-driven workflows. AIX, Solaris, and modern Linux distributions provide reliability, scalability, and security for mission-critical applications, while Salesforce Lightning Web Components (LWC) and AI-enhanced services such as Salesforce Einstein enable intelligent customer engagement, predictive analytics, and workflow automation. This review explores strategies for integrating these technologies, focusing on architectural models, performance optimization, security, compliance, and automation frameworks. Case studies across finance, healthcare, retail, and manufacturing illustrate practical applications, highlighting both operational benefits and technical challenges. Emerging trends, including edge computing, self-healing systems, AI-driven infrastructure optimization, and quantum-safe security, are examined to provide future-ready guidance. The review emphasizes how enterprises can leverage hybrid integration to achieve scalable, secure, and intelligent CRM environments, fostering innovation, operational resilience, and enhanced customer experiences.

AI-Powered Clinical Decision Support Systems Using Physiological Data From Connected Medical Devices

Authors: Shaurya Tomar

Abstract: The integration of Artificial Intelligence (AI) with the Internet of Medical Things (IoMT) has birthed a new generation of Clinical Decision Support Systems (CDSS) capable of real-time physiological monitoring. This review article examines the architectural and methodological shift from rule-based alerts to predictive AI engines that process high-frequency data from connected medical devices. We investigate the core pipeline of these systems—from signal denoising at the Edge to deep learning-based feature extraction in the Cloud—and evaluate how these technologies address the "data deluge" currently overwhelming clinical staff. The article provides a detailed taxonomy of AI methodologies, including Supervised Learning for diagnosis, Reinforcement Learning for treatment optimization, and the rising role of Explainable AI (XAI) in fostering clinician trust. Key clinical use cases are explored, ranging from early sepsis detection in the ICU to the management of chronic conditions like diabetes through closed-loop artificial pancreas systems. Furthermore, we address the critical barriers to adoption, specifically focusing on data quality, clinical alarm fatigue, and the "interoperability gap" between siloed medical systems. Finally, the review analyzes the 2025 regulatory landscape, including the impact of the EU AI Act and the FDA's evolving SaMD guidelines. We conclude that while AI-powered CDSS offers unprecedented potential for proactive care, its success depends on maintaining a "Human-in-the-Loop" approach, ensuring that AI augments rather than replaces clinical expertise.

DOI: http://doi.org/10.5281/zenodo.18159591

Optimizing Enterprise Resource Planning Performance Through Machine Learning–Based Predictive Maintenance Models

Authors: Navya Kulshreshtha

Abstract: The rapid evolution of Industry 4.0 has necessitated a transition from traditional administrative Enterprise Resource Planning (ERP) to "Intelligent ERP" systems that leverage real-time operational data. This review article investigates the optimization of ERP performance through the integration of Machine Learning (ML)–based Predictive Maintenance (PdM) models. While traditional maintenance strategies within ERP, namely reactive and preventive, often lead to unplanned downtime or resource wastage, ML-based PdM offers a data-driven alternative that predicts equipment failure before it occurs. This study synthesizes current literature regarding the architectural integration of Industrial Internet of Things (IIoT) sensors with ERP modules, such as Asset Management, Production Planning, and Materials Management. We categorize the predominant ML methodologies, including Supervised Learning for fault classification, Deep Learning (LSTM and GRU) for Remaining Useful Life (RUL) estimation, and Unsupervised Anomaly Detection, evaluating their specific contributions to enterprise-level efficiency. The review highlights how PdM-driven insights directly optimize ERP Key Performance Indicators (KPIs) by reducing maintenance costs, streamlining spare parts inventory through Just-in-Time (JIT) procurement, and enhancing Overall Equipment Effectiveness (OEE). Furthermore, the article addresses critical implementation challenges, such as data silos, scalability, and the "black box" nature of AI models. By analyzing the synergy between predictive analytics and resource orchestration, this review provides a roadmap for researchers and practitioners to build resilient, self-optimizing industrial ecosystems. The findings suggest that the integration of ML-PdM is no longer a peripheral technical upgrade but a core strategic necessity for modern enterprise resource management, enabling a shift from descriptive reporting to prescriptive action.
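The unsupervised anomaly detection branch mentioned above can be sketched in a few lines. The `anomaly_scores` and `flag_for_maintenance` names below are illustrative only (real PdM pipelines would use learned models such as autoencoders or LSTMs); the sketch shows the basic idea of z-scoring sensor readings against a baseline and raising a maintenance order in the ERP when a reading deviates too far.

```python
from statistics import mean, stdev

def anomaly_scores(readings):
    # Z-score each sensor reading against the sample baseline.
    mu, sigma = mean(readings), stdev(readings)
    return [(x - mu) / sigma for x in readings]

def flag_for_maintenance(readings, threshold=3.0):
    # Indices of readings anomalous enough to trigger an ERP work order.
    return [i for i, z in enumerate(anomaly_scores(readings)) if abs(z) > threshold]
```

A vibration series that is flat except for one large excursion flags only that excursion, which is the trigger that would feed a Just-in-Time spare-parts request into the Materials Management module.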

DOI: http://doi.org/10.5281/zenodo.18159637

A Conceptual Framework For Managing Invisible Risks In Cloud-Enabled Internet Of Things Environments

Authors: Kabir Sehgal

Abstract: The seamless integration of the Internet of Things (IoT) with Cloud Computing has revolutionized data-driven ecosystems, yet it has simultaneously birthed a sophisticated class of "Invisible Risks." Unlike traditional cyber threats that target known software vulnerabilities or hardware weaknesses, invisible risks emerge from the systemic complexity, algorithmic opacity, and "gray-zone" interactions inherent in distributed architectures. These risks, including data shadowing, logic flaws in cross-protocol interoperability, and the silent propagation of algorithmic bias, often bypass conventional signature-based detection systems, remaining latent until they manifest as catastrophic failures. This review article proposes a comprehensive Conceptual Framework for Managing Invisible Risks by synthesizing multi-disciplinary research across cybersecurity, system engineering, and cognitive psychology. We categorize these risks across a four-tier architecture: the Perception, Network, Cloud, and Application layers. Each layer is analyzed to identify the "invisibility triggers" that obscure threat vectors from administrative oversight. Furthermore, the paper evaluates contemporary risk assessment methodologies, advocating for a transition from static monitoring to dynamic observability through the use of Bayesian Networks, Digital Twins, and Chaos Engineering. We propose a proactive management strategy anchored by three pillars: Zero Trust Architecture (ZTA), AI-driven Automated Governance, and Edge Intelligence. The framework aims to bridge the "transparency gap" in Cloud-IoT environments, providing researchers and practitioners with a structured roadmap to identify, quantify, and mitigate hidden threats. Finally, the article discusses future directions, including the role of blockchain for provenance and quantum-resistant cryptography, emphasizing that the future of Cloud-IoT security depends on our ability to make the invisible visible.

DOI: http://doi.org/10.5281/zenodo.18159648

Implementing High-Performance Data Integration Pipelines For Analytics And Reporting In Complex Enterprise Landscapes

Authors: Nagender Yamsani

Abstract: High-performance analytics and reporting within large enterprises depend on data integration pipelines that can operate reliably across fragmented operational systems, governance boundaries, and performance constraints. As organizations expand their digital footprints, analytical workloads increasingly rely on structured data access mechanisms that balance scalability, control, and responsiveness. This study examines the design and implementation of enterprise data integration pipelines that support analytics and reporting in complex operational environments. It focuses on the interaction between API-mediated data access, SQL-based service layers, and transformation workflows that mediate between transactional systems and analytical consumers. The paper argues that sustainable analytics capability emerges from architectural coherence rather than isolated tooling choices. Evidence from large-scale enterprise environments suggests that pipelines emphasizing modular integration layers, performance-aware data transformations, and governed access models achieve higher analytical reliability and operational resilience. Empirical patterns indicate that separating data exposure concerns from transformation logic improves system adaptability while reducing downstream reporting volatility. The study introduces a conceptual framework that aligns integration architecture, operational performance controls, and governance enforcement into a unified model for enterprise analytics enablement. By articulating practical design trade-offs and architectural patterns grounded in real operational constraints, this work contributes a structured perspective that supports both applied implementation and future academic inquiry. The findings provide a foundation for understanding how disciplined integration engineering can enhance analytical trust, scalability, and long-term maintainability in enterprise reporting systems.
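The separation the abstract argues for, keeping data exposure concerns apart from transformation logic, can be made concrete with a toy two-layer pipeline. The `expose` and `transform` functions below are hypothetical illustrations, not the paper's implementation: the exposure layer enforces a governed field whitelist, while reporting transformations live in a separate layer that can change without touching access controls.

```python
def expose(records, fields):
    # Governed exposure layer: only whitelisted fields leave the source system.
    return [{f: r[f] for f in fields} for r in records]

def transform(records, fn):
    # Transformation layer, decoupled from exposure, so reporting logic
    # evolves independently of the access-control policy.
    return [fn(r) for r in records]
```

For example, a policy record carrying a sensitive identifier is first reduced to its reportable fields, and only then enriched for the downstream report, so a change to the enrichment never widens what the source exposes.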

Automated Classification of Large-Scale Network Configurations Using Machine Learning and Semantic Vectorization

Authors: Narendra Reddy Burramukku

Abstract: The rapid expansion of large-scale computer networks has introduced significant complexity in managing diverse network configurations. Manual classification and analysis of configurations are time-consuming, error-prone, and increasingly infeasible in dynamic environments. This paper presents a novel framework for automated classification of large-scale network configurations using machine learning combined with semantic vectorization. Network configuration files are first pre-processed and transformed into high-dimensional vector representations that capture both semantic and hierarchical relationships among configuration commands, protocols, and policies. These embeddings serve as input to supervised machine learning models, including Random Forest, Support Vector Machines, and Neural Networks, enabling accurate classification of network devices, roles, and compliance profiles. Experiments are conducted on real-world enterprise, cloud, and synthetic network datasets, comprising thousands of configuration files with diverse structures and device types. Results demonstrate that the proposed framework significantly outperforms traditional rule-based and feature-based approaches, achieving up to 94.5% F1-score with graph-based embeddings. Scalability analysis indicates the method can efficiently handle large volumes of configurations while maintaining high accuracy. The study highlights the effectiveness of semantic vectorization in capturing complex configuration semantics and facilitating robust automated classification. This framework provides a foundation for intelligent, scalable network management, supporting proactive policy enforcement, misconfiguration detection, and operational efficiency. Future work explores real-time classification, integration with network orchestration systems, and transformer-based embeddings for richer semantic representation.
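As a minimal stand-in for the semantic vectorization and supervised classification pipeline described above (the paper itself uses richer graph-based and learned embeddings), the sketch below embeds configuration text as a bag of tokens and assigns a device role by cosine similarity to labelled examples. The function names are illustrative assumptions, not the authors' API.

```python
import math
from collections import Counter

def vectorize(config_text):
    # Bag-of-tokens embedding of a device configuration; a crude proxy
    # for the semantic/hierarchical embeddings used in the paper.
    return Counter(config_text.split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def classify(config_text, labelled):
    # Nearest-neighbour role classification over labelled example configs.
    vec = vectorize(config_text)
    return max(labelled, key=lambda role: cosine(vec, vectorize(labelled[role])))
```

Even this trivial embedding separates a routing configuration from a switching one; the paper's contribution is showing that much richer embeddings scale the same idea to thousands of heterogeneous files.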

DOI: https://doi.org/10.5281/zenodo.18383730

Cloud-Native Network Monitoring: Tools, Architectures, And Best Practices

Authors: Narendra Reddy Burramukku

Abstract: Cloud-native networking has transformed modern enterprise and service provider infrastructures by enabling highly dynamic, scalable, and distributed environments based on microservices, containers, and multi-cloud deployments. While these architectures improve agility and resource efficiency, they also introduce significant challenges in maintaining visibility, performance assurance, and security. Traditional network monitoring approaches are inadequate for handling ephemeral workloads, high-velocity telemetry, and complex inter-service communications. This paper presents a comprehensive review of cloud-native network monitoring, focusing on monitoring tools, architectural frameworks, and operational best practices suitable for modern cloud-native ecosystems. It systematically analyzes open-source and commercial monitoring solutions, including Prometheus, Grafana, OpenTelemetry, ELK Stack, and cloud-provider-native platforms, highlighting their roles in metrics collection, logging, and distributed tracing. The study further examines key architectural models such as centralized, distributed, and hybrid monitoring frameworks, as well as agent-based and agentless approaches, emphasizing scalability, fault tolerance, and integration with orchestration platforms like Kubernetes. Best practices for observability design, metric selection, alerting, and automated incident management are discussed in the context of DevOps and Site Reliability Engineering (SRE). Additionally, the paper identifies critical challenges related to scalability, hybrid and multi-cloud observability, security, and privacy, while outlining emerging research directions including AI/ML-driven monitoring, autonomous remediation, and edge observability. By consolidating tools, architectures, and operational strategies, this paper provides a structured reference for researchers and practitioners seeking to design, deploy, and optimize effective cloud-native network monitoring systems.
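Alerting on high-velocity telemetry, one of the practices the abstract surveys, usually requires a breach to persist across several scrapes before firing. The sketch below is a simplified, hypothetical evaluator in the spirit of a Prometheus-style `for:` duration clause; it is not Prometheus's actual implementation.

```python
def evaluate_alert(samples, threshold, for_count=3):
    # Fire only when the metric breaches the threshold for `for_count`
    # consecutive scrapes, suppressing transient spikes.
    streak = 0
    for value in samples:
        streak = streak + 1 if value > threshold else 0
        if streak >= for_count:
            return True
    return False
```

Three consecutive samples above 90% CPU would fire, while a breach interrupted by a single healthy scrape resets the streak, which is exactly the behavior that keeps ephemeral container churn from flooding on-call engineers.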
Distributed System Automation Using Infrastructure-As-Code And CI/CD

Authors: Meera Krishnan

Abstract: Distributed systems have evolved into the foundational infrastructure supporting modern digital services, enabling cloud-native applications, microservices-based architectures, big data platforms, and globally distributed enterprise ecosystems. By leveraging geographically dispersed computing resources, distributed systems provide scalability, high availability, and fault tolerance. However, as system scale and architectural complexity increase, operational management becomes significantly more challenging. Organizations must address issues related to dynamic resource provisioning, configuration consistency, dependency management, automated scaling, continuous updates, and security enforcement across heterogeneous environments. Traditional manual administration approaches are insufficient for handling such complexity, often leading to configuration drift, deployment failures, environment inconsistencies, and increased operational risk. To overcome these limitations, automation-driven paradigms such as Infrastructure-as-Code (IaC) and Continuous Integration/Continuous Deployment (CI/CD) have emerged as essential components of modern distributed system management. Infrastructure-as-Code transforms infrastructure provisioning and configuration into machine-readable, version-controlled definitions, enabling reproducibility, consistency, and rapid environment replication. Simultaneously, CI/CD frameworks automate application build, testing, validation, and deployment processes, ensuring continuous delivery of reliable software updates across distributed architectures. The integration of IaC and CI/CD establishes a unified automation pipeline in which infrastructure and application lifecycles are managed cohesively, promoting operational efficiency, traceability, and resilience. This review comprehensively examines the conceptual foundations, architectural frameworks, and practical implementations of integrating IaC with CI/CD for distributed system automation. 
It analyzes declarative and imperative infrastructure models, automated deployment strategies, immutable infrastructure principles, and cloud-native orchestration practices. Furthermore, the paper evaluates the operational benefits of automation—including scalability optimization, reduced configuration drift, accelerated recovery, enhanced collaboration, and improved compliance management—while critically assessing associated challenges such as state management complexity, security vulnerabilities in automation scripts, pipeline debugging difficulties, and cost governance concerns. In addition, emerging paradigms such as GitOps, policy-as-code, DevSecOps, AI-driven pipeline optimization, and self-healing infrastructure mechanisms are discussed to highlight the ongoing evolution toward intelligent and autonomous system management. By synthesizing current practices and research directions, this review provides a structured perspective on how integrated automation frameworks enhance reliability, scalability, and security in distributed environments, while outlining future research opportunities aimed at achieving more adaptive, predictive, and cost-efficient distributed system operations.
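The declarative model at the heart of IaC, converging actual infrastructure onto a version-controlled desired state, can be sketched as a tiny reconciler. The `reconcile` function below is an illustrative toy in the spirit of a `terraform plan` diff, not any real tool's algorithm: it compares desired and actual state and emits create/update/delete actions.

```python
def reconcile(desired, actual):
    # Compute the plan that converges actual infrastructure onto the
    # declared desired state, eliminating configuration drift.
    plan = []
    for name, spec in desired.items():
        if name not in actual:
            plan.append(("create", name))
        elif actual[name] != spec:
            plan.append(("update", name))
    for name in actual:
        if name not in desired:
            plan.append(("delete", name))
    return plan
```

Because the plan is derived purely from declared state, running it twice is idempotent: once actual matches desired, the plan is empty, which is the property that makes IaC pipelines safe to re-run inside CI/CD.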

DOI: http://doi.org/10.5281/zenodo.18677076

Enterprise-Scale Application And Network Modernization Strategies

Authors: Vivek Menon

Abstract: Enterprise-scale modernization has evolved from a strategic option to an operational imperative in the contemporary digital economy. Organizations that continue to rely on legacy applications and rigid, hardware-centric network infrastructures face mounting challenges in sustaining competitiveness, operational efficiency, and security resilience. Rapid technological innovation, evolving customer expectations, intensifying cloud-native competition, and increasingly sophisticated cyber threats are collectively reshaping the enterprise IT landscape. Systems originally designed for stability and centralized control now struggle to support modern requirements such as real-time analytics, elastic scalability, distributed workforce enablement, continuous deployment cycles, and AI-driven automation. As a result, modernization initiatives are becoming foundational to long-term enterprise sustainability and growth. This review provides a comprehensive examination of enterprise modernization strategies across both application and network domains. On the application side, modernization approaches such as cloud migration, microservices adoption, API-first design, containerization, DevOps integration, and Infrastructure as Code (IaC) are analyzed for their impact on scalability, agility, and maintainability. Transitioning from monolithic architectures to modular, loosely coupled systems enables organizations to accelerate innovation cycles, improve fault isolation, and enhance operational efficiency. Simultaneously, adopting cloud-native frameworks facilitates resource elasticity, cost optimization, and global service delivery. From a networking perspective, the paper explores the transformation from traditional perimeter-based infrastructures to software-defined networking (SDN), software-defined wide area networking (SD-WAN), and Zero Trust security architectures. 
These paradigms introduce centralized control, programmable network policies, identity-based access enforcement, and continuous monitoring capabilities. By decoupling control and data planes and embedding security mechanisms directly into network layers, enterprises can enhance visibility, reduce lateral threat movement, and support distributed cloud environments. Furthermore, the review evaluates automation-driven infrastructure and AI-enabled operations (AIOps) as critical enablers of modernization at scale. Automated provisioning, predictive monitoring, anomaly detection, and self-healing systems reduce operational complexity while improving service reliability. Governance frameworks, compliance integration, risk mitigation strategies, and cultural transformation are also discussed as essential components of successful modernization initiatives. The paper highlights both the tangible benefits—such as improved agility, cost reduction, resilience, and competitive advantage—and the inherent technical and organizational challenges associated with modernization, including data migration complexity, legacy integration risks, skill gaps, and change resistance. Finally, emerging trends such as AI-native architectures, edge computing integration, 5G-enabled connectivity, platform engineering, and sustainable green IT practices are examined as shaping forces of next-generation enterprise IT ecosystems. Overall, enterprise-scale modernization is framed not merely as a technological transition but as a strategic, organizational transformation that redefines how enterprises design, secure, deploy, and manage digital systems in an increasingly complex and interconnected world.

DOI: http://doi.org/10.5281/zenodo.18677080

AI-Powered Identity And Access Management Systems

Authors: Elena Volkova

Abstract: In the modern era of decentralized workforces and cloud-native architectures, the traditional perimeter-based security model has collapsed, giving way to identity as the new primary security boundary. Identity and Access Management (IAM) systems are now the gatekeepers of enterprise resources, yet they face an unprecedented volume of sophisticated attacks, ranging from credential stuffing to advanced social engineering. This review examines the paradigm shift toward AI-Powered Identity and Access Management Systems. By integrating Machine Learning (ML) and Deep Learning (DL) algorithms, modern IAM frameworks have transitioned from static, rule-based engines to dynamic, risk-aware ecosystems. These systems leverage User and Entity Behavior Analytics (UEBA) to establish granular baselines of normal activity, allowing for the real-time detection of anomalies that signal compromised credentials or insider threats. This article categorizes current AI methodologies, including the use of neural networks for biometric authentication and reinforcement learning for adaptive access control policies. We explore how AI mitigates "entitlement creep" and automates the complex lifecycle of identity governance. Furthermore, the review addresses the integration of AI within Zero Trust Architectures (ZTA), where continuous authentication replaces the "authenticate once, access forever" model. By synthesizing recent research and industrial deployments, this paper provides a strategic roadmap for the next generation of identity security. The findings suggest that while AI significantly enhances the precision of access decisions, its success depends on data privacy, model transparency, and resilience against adversarial manipulation.
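The UEBA idea described above, scoring each access event against a behavioral baseline and adapting the decision to the risk, can be shown in miniature. The `risk_score` and `access_decision` helpers, the weights, and the baseline fields are all illustrative assumptions; real deployments learn these baselines with ML rather than hard-coding them.

```python
def risk_score(event, baseline):
    # Each deviation from the user's learned baseline raises the risk:
    # unknown device, unusual country, or off-hours login.
    score = 0
    if event["device"] not in baseline["devices"]:
        score += 40
    if event["country"] != baseline["country"]:
        score += 40
    if not (baseline["hours"][0] <= event["hour"] <= baseline["hours"][1]):
        score += 20
    return score

def access_decision(event, baseline, deny_at=60):
    # Risk-adaptive outcome instead of a static allow/deny rule:
    # low risk passes, moderate risk steps up to MFA, high risk is denied.
    score = risk_score(event, baseline)
    return "deny" if score >= deny_at else "mfa" if score >= 20 else "allow"
```

A familiar laptop at 10 a.m. sails through, the same laptop at 3 a.m. triggers step-up authentication, and an unknown device from an unusual country is denied outright, which is the "dynamic, risk-aware" behavior the review contrasts with static rule engines.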

DOI: https://doi.org/10.5281/zenodo.19491983

AI-Augmented Zero Trust Security Architectures

Authors: Tharushi Silva

Abstract: The rapid evolution of cyber threats, coupled with the increasing complexity of distributed computing environments, has necessitated a paradigm shift in enterprise security strategies. Zero Trust Security Architecture (ZTSA), which operates on the principle of “never trust, always verify,” has emerged as a robust framework to mitigate modern attack vectors. However, traditional Zero Trust implementations often struggle with scalability, dynamic policy enforcement, and real-time threat adaptation. The integration of Artificial Intelligence (AI) into Zero Trust frameworks introduces a transformative approach by enabling adaptive, context-aware, and predictive security mechanisms. AI-augmented Zero Trust architectures leverage machine learning, behavioral analytics, and automation to continuously evaluate trust levels, detect anomalies, and enforce granular access controls. This review explores the convergence of AI and Zero Trust, highlighting architectural components, implementation strategies, and challenges. It further examines how AI enhances identity verification, network segmentation, and threat intelligence, while addressing issues such as data privacy, model bias, and operational complexity. By synthesizing current research and industry practices, this article presents a comprehensive overview of AI-driven Zero Trust systems and their role in securing next-generation digital infrastructures.
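The "never trust, always verify" principle means every single request is evaluated against identity, device posture, and the micro-segment it targets, with no implicit trust carried over from earlier requests. The policy-check sketch below is a hypothetical simplification of a Zero Trust policy enforcement point; the field names and trust scores are assumptions for illustration.

```python
def authorize(request, policies):
    # Per-request verification: identity (role), micro-segmentation
    # (segment), and device posture must all satisfy some policy.
    for policy in policies:
        if (policy["role"] == request["role"]
                and request["segment"] in policy["segments"]
                and request["device_trust"] >= policy["min_device_trust"]):
            return True
    return False
```

An analyst with a healthy device reaches the reporting segment but not the production database, and the same analyst on a degraded device is refused even for reporting; the AI augmentation discussed in the abstract would make `device_trust` and the policies themselves continuously learned rather than fixed.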

DOI: https://zenodo.org/records/19491997

Published by:

Cloud-Agnostic Solutions for Multi-Biometric Systems: A Java-Based Approach

Uncategorized

Authors: Dr. Vinayak Ashok Bharadi

Abstract- This paper presents a cloud-agnostic architecture for managing multi-biometric systems, focusing on scalability, modularity, and interoperability. Building on Manchana’s 2020 research, the framework leverages Java-based design patterns, including Singleton and Factory, to enable seamless deployment across multiple cloud platforms. The proposed solution facilitates dynamic workload distribution and device management, addressing the challenges of real-time biometric processing in resource-constrained environments. The results demonstrate enhanced scalability and adaptability, with the framework supporting up to 100,000 biometric records and ensuring efficient system performance under high loads.

DOI: 10.61137/ijsret.vol.7.issue5.712

Published by:

Revisiting Factor Models After 2020: Machine Learning, Factor Stability, and Investment Performance

Uncategorized

Authors: Oksana Anatolyevna Malysheva

Abstract: The post-2020 financial market environment, shaped by the COVID-19 shock, unprecedented monetary interventions, and heightened macroeconomic uncertainty, has cast new doubt on the reliability and persistence of traditional asset pricing factor models. Although classical factor models have long underpinned portfolio construction and risk management, mounting evidence suggests that factor stability can break down under structural regime changes. This paper reexamines factor models in the post-2020 period, with specific focus on factor persistence and on the incremental value of machine learning methods in explaining and predicting investment outcomes. The main aim is to evaluate whether traditional risk factors remain priced and economically significant beyond 2020, and to test whether machine learning-based methods forecast portfolio performance better than traditional linear models. The study builds standard factor portfolios over a broad equity universe in the post-2020 sample and compares their performance against machine learning-based models that capture nonlinear links and time-varying interactions between firm characteristics. The methodology combines benchmark linear factor regressions with supervised machine learning algorithms, including ensemble-based methods, applying consistent training and validation procedures to each. Factor stability is assessed through rolling-window estimation and structural change analysis, and investment performance through risk-adjusted return measures and transaction cost-adjusted portfolio performance. The results show that the stability and persistence of several conventional factors declined significantly after 2020 and became more sensitive to market regimes. The machine learning models delivered greater out-of-sample and risk-adjusted returns, although the gains are not uniform across factors. The research provides empirical evidence on post-crisis factor behavior and offers practical guidance for integrating machine learning into factor-based investment strategies under changing market conditions.
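The rolling-window estimation used to assess factor stability can be sketched as follows. The `beta` and `rolling_betas` functions are an illustrative single-factor toy, not the paper's multi-factor methodology: a drifting sequence of window-by-window slope estimates is exactly the signal of factor instability the study measures.

```python
def beta(asset, factor):
    # OLS slope of asset returns on a single factor's returns.
    n = len(asset)
    ma, mf = sum(asset) / n, sum(factor) / n
    cov = sum((a - ma) * (f - mf) for a, f in zip(asset, factor))
    var = sum((f - mf) ** 2 for f in factor)
    return cov / var

def rolling_betas(asset, factor, window):
    # Window-by-window factor exposures; drift across windows signals
    # time-varying (unstable) factor loadings.
    return [beta(asset[i:i + window], factor[i:i + window])
            for i in range(len(asset) - window + 1)]
```

For an asset whose returns are exactly twice the factor's, every window recovers a beta of 2; in real post-2020 data the interesting finding is precisely that these window estimates stop being constant.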

Published by:

How much does it cost to publish a paper in a journal

Uncategorized

Research scholars pursuing a master's or doctoral degree have to publish their research work in a reputed international journal. But most scholars receive no funds from their college or university to complete their degree work, and hence worry about how much it costs to publish a paper in a journal. To resolve such doubts, this article helps answer a few of the questions below related to publication.

How much does it cost to publish a research paper in India ?

Normal international journals charge Rs 1,000 to Rs 2,500; depending on indexing, charges can increase to Rs 10,000 to Rs 25,000. Some journals are free, but publishing a paper there is tough for beginners.

Cost of publishing a research paper in India?

Paper Publication Charges

So the cost of publication ranges from zero to Rs 30,000, depending on the journal's indexing.

How much does it cost to submit a paper to a journal?

Submitting a paper is free in almost all journals, but some charge a nominal fee of Rs 300 to Rs 1,500. Journals with paid submission simply want to ensure authors submit good content only.

How much does it cost to publish a paper?

Publishing a paper includes publication charges or article processing charges, certificate charges (if applicable), hard copy charges (optional), and formatting or language improvement charges. The sum of all these charges is the total cost of paper publication.
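Since the total cost is just the sum of the individual charges, a small helper makes the arithmetic explicit. The function name and default values below are illustrative, not any journal's actual fee schedule.

```python
def total_publication_cost(processing, certificate=0, hard_copy=0, formatting=0):
    # Total cost = article processing charge + certificate charge
    # (if applicable) + hard copy charge (optional) + formatting charge.
    return processing + certificate + hard_copy + formatting
```

For example, a Rs 1,500 processing charge plus a Rs 300 certificate and a Rs 500 hard copy comes to Rs 2,300 in total.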

How much does it cost to publish a research paper?

Publishing a research paper does not differ much from publishing a review paper; the cost depends entirely on where you want to publish.

How much does it cost to publish a scientific paper?

A scientific paper is comparable to other kinds of research material, and the cost of publishing such content also depends on publisher policy. It normally varies from USD 0 to USD 2,200.

I hope this article helps scholars answer how much it costs to publish a research paper in a journal. Before submitting a paper, consult your guide or mentor, so their experience helps you get a good publication at less cost and in less time. Scholars should prepare a good paper with an abstract giving a clear understanding of the work and a conclusion showing the output of the paper or research. People who need a low-cost publication journal can submit a paper directly at ijsret.com@gmail.com.

Published by:

Investigation of Co-Pyrolysis of Plastic and Cocos Nocifera

Uncategorized

Authors: S. Ajith Kumar, Janobin S, Halin Heijal A, Bisel M

Abstract: All over the world, the use of fossil fuels in day-to-day life, in liquid, gaseous, and solid form, has greatly increased. As the population grows, the use of biomass is also increasing; biomass has served as an alternative fuel since the earliest era of living organisms. The use of plastic has likewise increased, and its disposal and recycling pose a challenging task: researchers face tedious problems in destroying plastic, since its decomposition is highly dangerous to the soil. Pyrolysis is among the most promising techniques for converting waste into useful forms of energy. However, destroying biomass and plastic separately leaves a large carbon footprint, and the end products of plastic-only and biomass-only pyrolysis each have deficiencies in their characterization. To overcome these drawbacks, biomass and plastic are combined in bulk and subjected to co-pyrolysis in search of a better result. The properties of the pyrolysis end product, namely ash content, moisture content, kinematic viscosity, density, flash point, fire point, cloud point, pour point, gross calorific value, and specific gravity, are determined.

DOI: 10.61137/ijsret.vol.7.issue4.648

Published by:

Dynamic Mechanical Analysis of Discarded Polyurethane with Polyester Hybrid Composites

Uncategorized

Authors: E. Bravin Daniel, Ajin B S, Ajun Jijo S, Berin J

Abstract: A composite material is a combination of two materials with different physical and chemical properties. When combined, they create a material specialized for a certain job, for instance becoming stronger, lighter, or resistant to electricity; combining materials can also improve strength and stiffness. Polymer matrix composites (PMCs) are the most widely used composite type. As with polymers, the environment typically needs to be controlled to obtain consistent test results for PMCs. PMCs have gained considerable interest mainly due to their low cost and their higher specific strength and stiffness compared to conventional metallic alloys. The prepared composites were evaluated by dynamic mechanical analysis and related methods, namely dynamic mechanical analysis, X-ray diffraction, and Fourier transform infrared spectroscopy. The aim is to conduct the dynamic mechanical analysis of a polyurethane and polyester composite. Polyurethane is a thermosetting polymer that can withstand high temperature; polyester, on the other hand, is a fiber material with a ductile nature. After preparation of the composites, the mechanical behavior of the specimens is analyzed.

DOI: 10.61137/ijsret.vol.7.issue4.647

Published by: