IJSRET » Blog Archives

Author Archives: vikaspatanker

Paper Publishing Journals

Journal to publish paper

Scholars searching for an international journal in which to publish their research, survey, or review paper are often confused by the number of publishers available. This article helps young researchers gain a clear understanding of the paper publication process. Authors searching for a journal are often in a hurry, so they should consult their guide before submitting their research work. Some journals require submission in a specific paper format. A good paper publishing journal should have a number of features, such as:

Submit Your Paper  / Check Publication Charges

1. ISSN number
2. Good Impact Factor
3. Paper publication in each issue
4. Peer Review Team
5. Responsive Publisher Team
6. Digital Certificates for Publication

Young authors who have recently started their research should check that a site has the above features. IJSRET provides publication with all the necessary features: a valid ISSN number, an authorized Impact Factor, a chat option, etc. Authors in need of a fast and low-cost publication journal should cross-check their paper against the following points:

• Paper content should show a clear understanding of the research topic.
• Tables used in the paper should have proper titles and be used in the paper.
• Graphs used in the paper should have proper titles and be used in the paper.
• Figures used in the paper should have proper titles and be used in the paper.

It is also suggested to follow the reviewer comments if modifications are suggested in the mail after peer review. Authors will get a good response from the journal team when publishing their paper. Reviewer comments mention valid points about the research field, which improve the quality of the work as well. Paper formatting is done by the internal team of the IJSRET international journal. All authors receive digital certificates with a link to their paper in the respective issue of the current volume. Hence, paper publishing journals also make the communication steps easy for young researchers.


Call For Papers In Journals 2024

Uncategorized


Many journals invite authors to publish a paper in their upcoming volume or issue. Authors therefore need to check various factors for each journal, such as:

• International Journal Impact Factor
• Indexing of the International Journal
• Paper publication cost
• Author guidelines
• Upcoming publication date

Calls for papers are issued by editors using various means, such as conducting international and national conferences. Authors looking for an international engineering, science and technology journal with an Impact Factor above 2.5 can submit their paper at: https://ijsret.com/paper-submission/. Papers can be submitted for easy and fast publication. Authors are advised to discuss the paper content with all co-authors before publication. Students, professionals and professors can publish their research work in all engineering and science fields.


Call for papers free publication

Journal to publish paper


Scholars who are completing their research work can publish a paper in a good peer-reviewed international journal. Young researchers are often looking for publication in a fast and low-price international journal. With this goal in mind, authors are advised to recheck the paper content before submitting at https://ijsret.com/paper-submission/. This international journal, which has a Call For Papers 2024, provides the following benefits:

1. Get email updates at each step of the paper's progress.
2. The review process is fast if the author requires a fast response.
3. Instant paper publication once the paper is accepted by the reviewer.
4. Each author gets a separate digital certificate.
5. Paper formatting is done by the journal's internal team.


Following are the steps of paper publication:
1. Submit your paper at:
2. After submission, the author will get a submission mail.
3. During the review process, the author has to wait a few days.
4. Once the review process is over, an acceptance or paper-improvement mail will be sent with suggestions.
5. Authors then have to act on the reviewer suggestions, submit the copyright form, etc.


call for papers computer science



Researchers looking to publish in 2024 can submit their paper at https://ijsret.com/paper-submission/, as this journal has issued a call for papers for 2024. Authors are advised to check the list of paper-writing parameters below, as this reduces review and publication time.

 

1. Check the paper title, as it should be unique.
2. Carefully write the abstract and conclusion sections of your paper.
3. Remove plagiarism from the paper.
4. Each graph and table should have a proper title, and its use in the paper should be clear.
5. All references listed should actually be cited in the paper.

 

After submission, the author will get a submission mail, and the paper then goes through the peer review process. The author has to wait a few days. Once the review process is over, an acceptance or paper-improvement mail will be sent with suggestions. It is suggested to take quick action after this mail, such as improving the paper as per the reviewer comments, submitting the copyright form, etc.


IJSRET Volume 4 Issue 6, Nov-Dec-2018


Research on Optimization of Sand Moulding Process
Authors: Manoj Bhandari, Mayank Sharma

Abstract:-Casting is a manufacturing process in which molten metal or liquid is converted into a desired shape for applications in fields such as automobile parts, firefighting parts, valve parts, and electrical equipment. The solid part received after the whole process is known as the casting. The process has been known for thousands of years and is one of the most popular and simplest methods, allowing products to be generated at reasonable cost in the manufacturing industry. To meet the ever-growing demand of the consumer industry, companies need to produce large quantities of accurate products in suitable time, and most of them use hit-and-trial methods for casting. This results in error, time consumption and cost losses in the casting process. The origination of casting defects, such as gas porosity, shrinkage and mould material defects, is a limitation that needs to be considered in order to diminish the effect of defects in the final casting product. Determining the optimum conditions for sand casting is a challenging task, as is the selection of material and process parameters. Grain sizes of 300 and 600 microns and moisture contents of 26% and 28% were selected to produce the sand mouldings, and the samples were tested for tensile and compressive strength. In this project, four samples with different operating conditions were taken and compared to find the optimized result. The results indicated that the selected process parameters significantly affect casting defects such as gas porosity and shrinkage in the foundry lab. This paper illustrates the optimization of sand casting process parameters, including optimum levels, with a case study done in a foundry lab.
The results obtained in the experiment demonstrate that smaller grain size particles with a medium level of moisture (300 micron sand and 28% water content) deliver optimum results for casting.

Magnonic Holographic Memory
Authors: Gunjan, Tushar Karmakar

Abstract:-Data storage has been a fundamental part of computing from the beginning, and storage technology has been evolving alongside the other areas of computing technology. As these other areas have developed, the need for bigger and faster storage has become evident, and new technologies are constantly being developed as old technologies reach their physical limits and become outdated. Holographic memory has received attention in recent years as a technology that can provide very large storage density and high speed. Information is recorded in a holographic medium through the interference of two coherent beams of light: the information-carrying beam is called the signal beam, and the interfering beam is the reference beam. The resulting interference pattern causes an index grating (the hologram) to be written in the material. When the hologram is subsequently illuminated with one of the original reference beams, light is diffracted from the grating in such a way that the signal beam is reproduced. Many holograms can be multiplexed within the same volume of material by angle, fractal, wavelength and phase-code multiplexing. Holographic memory can read and write data in parallel, allowing much higher data transfer speeds: unlike conventional storage media such as magnetic hard disks and CD-ROMs, which can access only one bit at a time, each access of a holographic memory yields an entire data page, more than a megabit at a time. Holographic random access memory design leads to the implementation of compact and inexpensive memory modules that can be used to construct large read-write memories.

A Study of Gulbarga (Kalaburagi) City (Karnataka), India
Authors: Mala Ramesh Ramchandra Rao, Dr. K. Vijakumar

Abstract:-The maintenance of healthy nature is an integral part of urban planning. Well-maintained greenery contributes immensely to various sectors, forms an income-generating programme, and thereby enhances property value. Gulbarga is popularly known as the 'Cement Kashi and Bowl of Pulses'. The buildings in the city are architecturally beautiful and are endowed with well-maintained parks and open spaces. One of the adverse effects of rapid and relatively unplanned growth is heavy encroachment, leading to the problem of shrinking green space, and City Corporation resources are inadequate to fully meet the basic domestic needs of the city's green spaces. Gulbarga has a total of 248 parks in addition to a large number of institutional open spaces and avenue plantations. The main objective of the cultivation and management of trees is their contribution to the physical, social and economic well-being of the urban community. The city has only one lake, Sharnabasaveshwar Lake, which adds to the beauty and environmental value of the city. This lake is a popular picnic spot and is frequented by nature lovers. Sewage water increases the capacity of the local lake during the monsoon.

Trends in Nursing Education
Authors: Asst. Prof. Beulah Jasmine Rao

Abstract:-Nursing is dynamic in its own way, and this dynamism gives rise to various trends. Sound empirical knowledge is the base of nursing, as in any other profession, and this knowledge is the base for all the innovations which in turn evolve as trends in nursing. The trends in nursing education are the cornerstone of the dynamic nature of the nursing profession. The article outlines various trends in nursing education with reference to India. The trends are organized under the areas of curriculum innovations, technology and nursing, student population, the clinical teaching-learning process, evaluation systems, quality assurance, knowledge expansion and modes of education.

Identity based Data Sharing and Profile Identifying For Healthcare Applications in Cloud Storage
Authors: M.Tech. Scholar D. Param Jyothi, Asst. Prof. T. Venkataramana, Asst. Prof. Karamala Suresh

Abstract:-Online social networks (OSNs) have become popular around the world due to their openness, and cryptographic techniques can provide privacy protection for users in OSNs. Cloud computing and social networks are changing the way of healthcare by providing real-time data sharing in a cost-effective manner. However, data security is one of the major obstacles to the extensive application of mobile healthcare social networks (MHSN), since health information is considered extremely sensitive. In this paper, we introduce a secure data sharing and profile matching scheme for MHSN in cloud computing. Patients can outsource their encrypted health records to cloud storage with the identity-based broadcast encryption (IBBE) technique and share them with a group of doctors in a secure and efficient manner. We then present an attribute-based conditional data re-encryption construction, which permits the doctors to convert a ciphertext into a new ciphertext of an identity-based encryption scheme for a specialist without leaking any sensitive information. Further, we provide a profile matching mechanism in MHSN based on identity-based encryption with equality test, which helps patients find friends in a privacy-preserving way and achieves flexible authorization on the encrypted health records. Moreover, this mechanism reduces the computation cost on the patient side.

On Electric Power Supply Chain Model For Three Different Tariff Customers in South East Nigeria
Authors: Osisiogu, U. A., Okafor, C. N.

Abstract:-The study proposed a new model of electric power supply chain networks for the Nigerian situation, for three different tariff categories of customers. The network allows for multiple power generators, transmission, distribution, retailing and demand customers, and introduces the retailing of power from distributors to demand customers. We derived the optimality conditions of the decision-makers and proved that the governing equilibrium conditions satisfy a variational inequality problem. The variational inequality problem for a single power generator, single transmission, single distributor, single retailer and a demand customer was used to illustrate the method. The multi-start optimization method was used to solve the inequality using specified start-up values obtained from the Nigerian Electricity Regulatory Commission (NERC). For residential customers with single-phase supply, a single meter and consumption of 50 kWh and below (R1 tariff); for customers with single-phase supply, a single-phase meter and consumption above 50 kWh (R2S tariff); and for customers with three-phase supply, a three-phase meter and consumption below 45 kVA (R2T tariff), the start-up values employed were the average daily power that distributors receive from power generators and the unit cost of power from distributors. The results of the analysis showed that the least cost of power, regardless of the category of customer, is N 469.15, while the highest cost is N 473.32. The highest shadow price was N 978.93 for R1 customers, while the least was N 928.61 for R2T customers.

A Survey on Internet of Things
Authors: Research Scholar Deepika. N, Professor Dr.M. Anand


Need Ride Friend Carpooling and Chat Using Android System
Authors: I. Piranavanath, Mathula. T

Abstract:-One in every five people in the world has a smartphone, and smartphone use is increasing day by day. The aim of this system is to give users a carpooling-with-messaging Android app on the smartphone. The project creates a social media application that explores the uses of augmented reality to improve the user experience in social media. Through the application, users can rate live locations like restaurants, supermarkets, shopping malls, arcades and also free places, and save live memories in a virtual world created by the augmented reality application. This social media app offers enormous features, lets the user feel more lively, and enables useful activities; it runs on the Android platform. Assume a user wants to go to a place: the user can search for that place through the app and will get the list of ratings given by friends. Also, when the user reaches a place, they can switch on the camera and, through the augmented reality application, see their friends live in that specific area. The project uses cloud storage for the database of users, and the Google Maps Application Programming Interface (API) to get users' locations to the exact point. The application can also detect when the user is entering or exiting a place; on exiting, the user gets a feedback form to fill in, which helps the user rate the driver's ride.

Structural Properties of Pr1-xSrxFeO3 Materials for Solid Oxide Fuel Cells

Authors: Padigela Srinivas Reddy, V. Prashanth Kumar, Suresh Sripada

Abstract:-Pr1-xSrxFeO3 (0≤x≤0.6) cathode materials were prepared by the sol-gel method. The powders were formed in the range of 600°C – 700°C. The samples were characterized as single-phase GdFeO3-type perovskite structure using powder XRD. The orthorhombic distortion decreases with increasing x value, and the reduction of the pseudo-cubic lattice constant is consistent with the Sr content. The crystallite size is in the range 25 nm to 30 nm for the as-synthesized powders and 55 nm to 60 nm for the calcined powders. The weight loss is 3% to 26%, and with increasing x value the weight loss decreases for the as-synthesized powders.

IOT Based Secured Smart Home Automation System Using Raspberry Pi
Authors: Ansari Huzaifa, Ansari Affan, Shaikh Mohammed Ahmed, Shaikh Adeen Shaikh, Mohammed Ashfaque

Abstract:-This documentation presents a home automation system implemented using the Raspberry Pi. This paper is a proof of concept for a home that can be controlled remotely through the use of a device. A controlled home like this can make life more convenient and also safer. The device monitoring aspect of this project demonstrates the ability to know what is going on with different systems at home, which can be used for control and safety; for example, the state of sensors for intruder detection and the state of different devices like fans or lights. It is also demonstrated, using a few motors, how one can control different systems at home over the Internet: a virtual "switch" available in the cloud UI can be toggled to turn a fan at home on or off. This project has a large scope and can be integrated with many other systems, such as smart electronic appliances at home. This documentation describes the basic framework to achieve such a connected home, the combination of hardware and software in the current implementation, and future improvements and scope. The project also adds a CCTV camera at the door for intruder detection.
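
The cloud-side "virtual switch" idea described above can be sketched as a tiny state model that the Raspberry Pi would poll and mirror onto its GPIO pins. This is a minimal illustration, not the project's code: the `SmartHome` class and the device names are assumptions, and no hardware access is shown.

```python
# Minimal sketch of the "virtual switch" state model: the cloud UI keeps
# per-device on/off state, and the Pi would poll this state and drive
# the matching GPIO pins. Device names are illustrative assumptions.

class SmartHome:
    def __init__(self, devices):
        self.state = {name: False for name in devices}  # everything off

    def toggle(self, name):
        """Flip a device on/off and return its new state."""
        self.state[name] = not self.state[name]
        return self.state[name]

    def status(self):
        """Snapshot of all device states, as the UI would display them."""
        return dict(self.state)

if __name__ == "__main__":
    home = SmartHome(["fan", "light", "camera"])
    home.toggle("fan")
    print(home.status())   # fan is now on, the others remain off
```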

Review of Prediction of Product Recommendation Using Clustering Technique and Voting Scheme
Authors: M.Tech. Scholar Versha Patel, Asst. Prof. Pritesh Jain

Abstract:- On the Internet, where the number of choices can be overwhelming, there is a need to filter, organize and effectively deliver relevant information in order to alleviate the problem of information overload, which has become a real issue for a large number of Internet users. Recommender systems solve this problem by searching through large volumes of rapidly generated information to provide the user with customized content and services. This paper examines the various aspects and potential of the particular prediction techniques used in recommendation methods, to serve as a compass for research and practice in the field of recommendation techniques.

Review of Study of Stock Market Prediction
Authors: M.Tech. Scholar Sachine Kumawat, Asst. Prof. Pritesh Jain

Abstract:- Stock market forecasting involves predicting the future value of a company's stock or another financial instrument traded on an exchange. Different kinds of trading are possible in the stock market, from intraday trading to long-term trading, but if someone can predict the value or class of an instrument, it can yield a great return on the investment made. Before the advent of the computerized world, traders used paper-based procedures like fundamental and technical analysis. Various important technical indicators like SMA, EMA and MACD were found to be very useful, but with the arrival of computer technologies and algorithms, forecasting moved into the computational domain. Researchers started building prediction systems using Neural Networks, Support Vector Machines, Decision Trees and Hidden Markov Models, and forecast accuracy genuinely improved with these algorithmic approaches. This survey covers various traditional as well as evolutionary data mining techniques used for stock market prediction.
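
Two of the classical indicators named in the abstract, SMA and EMA, are simple enough to sketch directly. The window sizes and sample prices below are illustrative assumptions, not values from the survey; MACD is then just the difference of two EMAs, conventionally the 12-period and 26-period ones.

```python
# Minimal sketch of two technical indicators mentioned above, computed
# over a plain list of closing prices. Windows and prices are made up.

def sma(prices, window):
    """Simple Moving Average: mean of the last `window` prices at each step."""
    return [sum(prices[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(prices))]

def ema(prices, window):
    """Exponential Moving Average: weights recent prices more heavily."""
    k = 2 / (window + 1)      # standard smoothing factor
    out = [prices[0]]         # seed with the first price
    for p in prices[1:]:
        out.append(p * k + out[-1] * (1 - k))
    return out

if __name__ == "__main__":
    closes = [10, 11, 12, 11, 13, 14, 13, 15]
    print(sma(closes, 3))     # one value per full 3-day window
    print(ema(closes, 3)[-1])
```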

Identity-based Data Auditing and Hiding for Secure Cloud Storage
Authors: M.Tech. Scholar R. Haritha, Asst. Prof. M. Reddi Durga Sree, Asst. Prof. Karamala Suresh

Abstract:- Cloud computing is one of the significant developments that utilizes progressive computational power and upgrades data distribution and data storing facilities. With cloud storage services, users can remotely store their data to the cloud and realize the data sharing with others. Remote data integrity auditing is proposed to guarantee the integrity of the data stored in the cloud. In some common cloud storage systems such as the electronic health records system, the cloud file might contain some sensitive information. The sensitive information should not be exposed to others when the cloud file is shared. Encrypting the whole shared file can realize the sensitive information hiding, but will make this shared file unable to be used by others. How to realize data sharing with sensitive information hiding in remote data integrity auditing still has not been explored up to now. In order to address this problem, we propose a remote data integrity auditing scheme that realizes data sharing with sensitive information hiding in this paper. In this scheme, a sanitizer is used to sanitize the data blocks corresponding to the sensitive information of the file and transforms these data blocks’ signatures into valid ones for the sanitized file. These signatures are used to verify the integrity of the sanitized file in the phase of integrity auditing. As a result, our scheme makes the file stored in the cloud able to be shared and used by others on the condition that the sensitive information is hidden, while the remote data integrity auditing is still able to be efficiently executed. Meanwhile, the proposed scheme is based on identity-based cryptography, which simplifies the complicated certificate management. The security analysis and the performance evaluation show that the proposed scheme is secure and efficient.
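
A greatly simplified sketch of the integrity-auditing idea: the owner keeps a short tag per data block, and an auditor later checks a stored block against its tag. The paper's actual scheme uses identity-based signatures and a sanitizer for sensitive blocks; plain SHA-256 below is only an illustrative stand-in, not the paper's construction.

```python
# Greatly simplified stand-in for remote data integrity auditing:
# tag each block at upload time, then verify a stored block later.
# The real scheme uses identity-based cryptography, not bare hashes.

import hashlib

def tag(block: bytes) -> str:
    """Short fingerprint the owner keeps for each data block."""
    return hashlib.sha256(block).hexdigest()

def audit(stored_block: bytes, expected_tag: str) -> bool:
    """True iff the stored block is unmodified."""
    return tag(stored_block) == expected_tag

if __name__ == "__main__":
    t = tag(b"patient record 42")          # illustrative block content
    print(audit(b"patient record 42", t))  # True: block unchanged
    print(audit(b"tampered record", t))    # False: block was modified
```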

Prediction of Diabetes Mellitus Using Data Mining Techniques A Review
Authors: M.Tech. Scholar Varsha Rathour, Asst. Prof. Pritesh Jain

Abstract:- Data mining techniques are used to discover interesting patterns for medical diagnosis and treatment. Diabetes is a group of metabolic diseases in which there are high blood glucose levels over a prolonged period. This paper presents a general literature review of the various data mining techniques for predicting diabetes, which should help researchers learn the various data mining algorithms and methods for the prediction of diabetes mellitus.

Review Paper of Corresponding job on Multi-core Processors
Authors:- M.Tech. Scholar Anil Malviya, Assistant Professor Mr. Pritesh Jain

Abstract:- Multi-core processor technology has improved greatly, and multi-core processors offer considerably better performance than single-core processors, making it possible to run computation-intensive real-time applications with strict timing constraints. Standard multiprocessor real-time scheduling generally follows sequential models, which neglect intra-task parallelism, while parallel models such as OpenMP can parallelize specific sections of tasks, leading to shorter response times when possible. In this paper, various research papers have been reviewed and classified as sequential real-time-task-based research or parallel real-time-task-based research. Furthermore, the different strategies used, such as task-splitting frameworks, scheduling policies and systems, are considered for comparing real-time task scheduling on multi-core processors.

Secure VANET Using Trust Management System
Authors:- M.Tech. Scholar Rakesh Mukati, Asst. Prof. Mr. Pritesh Jain

Abstract:-In the near future, vehicles will communicate with one another to create vehicular ad hoc networks (VANETs), giving rise to the idea of intelligent transportation systems. In this paper we present a survey of security in VANETs. Several researchers have described attacks and solutions in vehicular communication; we examine some of these security issues and the solutions proposed to overcome them. We discuss the need for robust vehicular ad hoc networks, which depend strongly on their security and privacy features. This paper reviews the existing attacks on VANETs from a security perspective and also presents solutions to particular attacks in VANETs.

Control of Statcom to Enhance Stable Power System Operation
Authors:- M.Tech. Scholar S. Reddiah Raju, Asst. Prof. P., Asst. Prof. V. Sunil Kumar Reddy

Abstract:-In this paper, two control methods, namely an adaptive voltage control method and a d-q axis control method, are proposed to ensure proper operation of a static synchronous compensator (STATCOM) and to provide an efficient and effective means of controlling power system stability. In both control methods, a PI controller is used to regulate the constant DC-link voltage across the DC-link capacitor as well as the reactive power, grid voltage and current. The simulations are conducted on the MATLAB/SIMULINK platform, and the performance of the STATCOM is investigated both for the normal state and for disturbances of the power system network. The obtained results confirm that the STATCOM with the mentioned controllers performs well in maintaining power system stability by controlling both bus voltages and reactive power in the network.
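
The PI regulation loop mentioned in the abstract can be sketched as follows. The gains, setpoint and the first-order plant stand-in are illustrative assumptions, not the paper's MATLAB/SIMULINK model.

```python
# Minimal sketch of a PI control loop regulating a measured quantity
# (e.g. a DC-link voltage) toward a setpoint. All numbers are made up.

def pi_step(error, integral, kp, ki, dt):
    """One PI update: returns (control output, new integral state)."""
    integral += error * dt
    return kp * error + ki * integral, integral

def simulate(setpoint=1.0, kp=2.0, ki=5.0, dt=0.01, steps=1000):
    """Drive a simple first-order plant with the PI controller."""
    v, integral = 0.0, 0.0          # plant state and integrator state
    for _ in range(steps):
        u, integral = pi_step(setpoint - v, integral, kp, ki, dt)
        v += (u - v) * dt           # toy first-order plant response
    return v

if __name__ == "__main__":
    print(round(simulate(), 3))     # should settle near the setpoint
```

The integral term is what drives the steady-state error to zero, which is why a plain proportional controller alone would settle below the setpoint here.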

A Survey of Load Balancing in Cloud Computing
Authors:- M. Tech. Scholar Komal Malakar, Assistant Professor Pritesh Jain

Abstract:-Load balancing is essential for efficient operations in distributed environments. As cloud computing is growing rapidly and clients are demanding more services and better results, load balancing for the cloud has become a very interesting and important research area. Many algorithms have been suggested to provide efficient mechanisms for assigning clients' requests to available cloud nodes. These approaches aim to enhance the overall performance of the cloud and provide the user with more satisfying and efficient services. In this paper, we investigate the different algorithms proposed to resolve the issue of load balancing and task scheduling in cloud computing. We discuss and compare these algorithms to provide an overview of the latest approaches in the field.
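
One of the classic algorithms in the family this survey covers, round-robin assignment, can be sketched in a few lines. The node and request names below are illustrative assumptions.

```python
# Minimal sketch of round-robin load balancing: each incoming request
# is assigned to the next node in cyclic order. Names are made up.

from itertools import cycle

def round_robin(nodes, requests):
    """Assign each request to the next node in cyclic order."""
    rotation = cycle(nodes)
    return {req: next(rotation) for req in requests}

if __name__ == "__main__":
    assignment = round_robin(["node-a", "node-b", "node-c"],
                             ["r1", "r2", "r3", "r4"])
    print(assignment)   # r4 wraps back around to node-a
```

Round-robin ignores node capacity and current load, which is exactly the kind of limitation the more sophisticated surveyed algorithms try to address.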

A Comparative Study of Delay-Tolerant Network And It’s Various Methods
Authors:- Rohit Hirvey, Ranjan Thakur

Abstract:-Currently, applications and research work in the domain of delay-tolerant networks have become very popular. Within a delay-tolerant network, a small amount of storage space is available in each node, so in the absence of links between nodes, packets may be stored in this storage space. Delay-tolerant networks can enable communication services within unreachable and mostly unfriendly areas. Delay-tolerant network forwarding algorithms may route traffic towards specific nodes to increase the delivery rate and reduce delays, even though raising the traffic through those nodes can itself be costly. In this paper, we review the architecture of the delay-tolerant network, along with some features of delay-tolerant mobile network routing protocols and the problems faced by delay-tolerant networks.

Analysing the Impact of Code Refactoring on Software Quality Attributes
Authors:-Himanshi Vashisht, Asst. Prof. Sanjay Bharadwaj, Sushma Sharma

Abstract:-Code refactoring is a process of restructuring existing source code: it changes the source code in such a way that the external behavior is not altered, while the internal structure is improved. It is a way to clean up code that minimizes the chances of introducing bugs. Refactoring is a change made to the internal structure of a software component to make it easier to understand and cheaper to modify, without changing the observable behavior of that component. Bad smells indicate that there is something wrong in the code that has to be refactored, and different tools are available to identify and remove these bad smells. Software has two types of quality attributes, internal and external. In this paper we study the effect of clone refactoring on software quality attributes.
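
A behavior-preserving refactoring of the kind the paper studies can be illustrated with a duplicated-code bad smell removed by extracting a helper function. The functions and prices below are invented for illustration, not taken from the paper.

```python
# Before: the discount rule is written out twice (a duplicated-code smell).
def total_before(prices):
    total = 0.0
    for p in prices:
        total += p * 0.9 if p > 100 else p   # discount rule, first copy
    extra = 120.0                            # e.g. a fixed add-on item
    total += extra * 0.9 if extra > 100 else extra  # duplicated copy
    return total

# After: the rule is extracted into one helper; behavior is unchanged.
def discounted(p):
    """Extracted helper: the discount rule now lives in one place."""
    return p * 0.9 if p > 100 else p

def total_after(prices):
    """Refactored version: identical external behavior, no duplication."""
    return sum(discounted(p) for p in prices) + discounted(120.0)

if __name__ == "__main__":
    print(total_before([50.0, 200.0]) == total_after([50.0, 200.0]))
```

The observable behavior (the returned total) is identical before and after, which is the defining property of a refactoring; only the internal structure changed.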

Review of Image Retrieval System Using Three Level Searching
Authors:- M.Tech. Scholar Saloni Verma, Asst. Prof. Sachine Mahajan

Abstract:-Enormous digital image databases have been produced by the widespread use of smart devices, together with the exponential growth of virtual societies. If not combined with effective Content-Based Image Retrieval (CBIR) tools, these databases can be counter-productive. The last decade has seen the introduction of promising CBIR systems, which have advanced applications in various fields. In this publication, a survey of state-of-the-art content-based image retrieval, including theoretical and empirical work, is presented. This work covers publications relevant to the CBIR area: the unsupervised and supervised learning and fusion techniques, along with the low-level image visual descriptors, are reported. Moreover, the challenges and applications that have driven CBIR research are discussed.

Survey Paper of Wireless Sensor Network
Authors:-M.Tech. Scholar Akshay Gupta, Asst. Prof. Megha Singh

Abstract:-In recent years there has been rapid advancement in the field of wireless sensor networks. This paper gives a brief introduction to wireless sensor networks and their applications in environmental and structural monitoring, smart home monitoring, industrial applications, healthcare, the military, vehicle detection, congestion control and RFID tags. With progress in WSNs, small and low-cost sensor nodes have become available that are capable of wireless communication, sensing various kinds of environmental conditions and processing data. Different types of routing protocols exist depending on the application and network architecture; routing protocols provide paths through the network and efficient multi-hop communication. WSNs can be found in many civil and military applications, including enemy intrusion detection, object tracking, patient monitoring, habitat monitoring and fire detection.

Dissolved Gas Analysis of Transformer
Authors:-Viral R Avalani, Kishan V Dave

Abstract:-Dissolved Gas Analysis has been widely used by utilities throughout the world as the primary diagnostic tool for transformer maintenance. The gases generated in the mineral oil of a power transformer can be monitored by conventional methods, by online DGA systems and by application of various artificial intelligence techniques. Assessment techniques for judging power transformer condition and lifespan have attracted considerable attention. Dissolved Gas Analysis (DGA) has proved to be a useful tool for diagnosing incipient and potential faults in power transformers. This paper discusses the pros and cons of the various DGA methods in practice and also describes an experimental investigation carried out to study the relation between gas generation and partial discharge. In the second part, a fuzzy logic based interpretation (FLI) method, based on fuzzy set theory, is described and implemented as an improved DGA interpretation method that provides higher reliability and precision of fault diagnostics.

IoT Based Healthcare Environment System
Authors:- Assistant Professor D.Thilagavathi

Abstract:-The Internet of Things (IoT) has evolved into various application areas, including medical and health care. The technology helps patients and doctors to forecast a variety of diseases accurately and to diagnose them according to the results. With the development of the IoT, more sensors, actuators and mobile devices have been deployed into our daily life. The IoT has numerous applications in healthcare, from remote monitoring to smart sensors and medical device integration. It has the potential not only to keep patients safe and healthy, but also to improve how physicians deliver care. IoT data generated by multi-modal sensors or devices show great differences in formats, domains and types. The full application of this paradigm in the healthcare area is a shared hope because it allows medical centers to function more efficiently and patients to obtain better treatment. This paper surveys data semantization in the IoT, data flows and the associated techniques, including their backgrounds, existing challenges and open issues.

Cycle Time Optimization in Injection Moulding
Authors:- M.Tech.Scholar Munendra Koli

Abstract:-Injection molding is the plastics equivalent of metal die casting. It is the most widely used method of mass producing close-tolerance three-dimensional articles over a wide range of sizes and variety of shapes. The parameters are evaluated against the problem of optimizing the cycle time for each part. Experimental data were collected on an IIM Milacron machine following designed-experiment procedures, and a statistical analysis was performed to give a basis for process improvement recommendations. The results of the experiment showed a way to achieve the goal of optimizing the cycle time in injection molding in a sensible and cost-efficient way. In this paper we study how to minimize cycle time, how many possible ways there are to optimize it, and what factors influence it. Any manufacturing activity would like to have optimized productivity and quality. In injection molding of plastics, if quality is taken care of by part design, mold design and mold precision, then productivity is also ensured on account of zero-defect moldings without rejection and optimized cycle time.

Analysis about Classification Techniques on Categorical Data in Data Mining
Authors:- Assistant Professor P. Meena

Abstract:-In recent years, a huge amount of data has been stored in databases, and it is increasing at a tremendous speed. This creates the need for new techniques and tools to intelligently analyze large datasets to acquire useful information. This growing need led to a new research field called Knowledge Discovery in Databases (KDD), or data mining. The main objective of the data mining process is to extract information from a large data set and transform it into some meaningful form for future use. Classification is one of the data mining techniques used to assign categorical data items in a data set to one of a predefined set of classes or groups. In this paper, the goal is to provide a comprehensive analysis of different classification techniques in data mining, including decision trees, Bayesian networks, the k-nearest neighbour classifier and artificial neural networks.
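One of the classifiers the paper analyzes, the k-nearest neighbour classifier, can be sketched in a few lines (the training points and labels below are invented toy data, not from the paper):

```python
import math
from collections import Counter

# Minimal k-nearest-neighbour classifier: find the k training points
# closest to the query and take a majority vote over their labels.
def knn_predict(train, labels, query, k=3):
    # Indices of training points, sorted by Euclidean distance to the query.
    order = sorted(range(len(train)), key=lambda i: math.dist(train[i], query))
    # Majority vote among the k closest neighbours.
    votes = Counter(labels[i] for i in order[:k])
    return votes.most_common(1)[0][0]

train = [(1, 1), (1, 2), (8, 8), (9, 8)]
labels = ["low", "low", "high", "high"]
print(knn_predict(train, labels, (2, 1)))  # → low
```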

A Study of Fraud Identification in Online Payment Instruments Using Data Mining Techniques
Authors:- Assistant Professor R. Ramya

Abstract:-The aim of the present work is to survey and analyze the use of electronic payment instruments and fraud detection across banks in the country, using statistics and information retrieved from the Central Bank together with data mining techniques. For this purpose, initially, according to the volume of transactions carried out and using the K-Means algorithm, a label was assigned to each record. Mining the valuable but hidden profile data in the mass volume of online transactions gives valuable information related to the losses from this criminal process. We analyze some of the recent approaches and architectures where data mining has been applied in the field of e-payment systems, and we limit our discussion to data mining in the context of e-payment systems.
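The labelling step described above can be sketched as follows. This is a pure-Python, one-dimensional K-Means for illustration only; the clustering details and the volume figures are assumptions, not the authors' actual data or implementation:

```python
# Cluster transaction volumes into two groups with K-Means and use the
# cluster as the record's label.
def kmeans_1d(values, iters=20):
    centroids = [min(values), max(values)]      # initial low/high centres
    for _ in range(iters):
        clusters = [[], []]
        for v in values:
            # Assign each value to its nearest centroid.
            i = 0 if abs(v - centroids[0]) <= abs(v - centroids[1]) else 1
            clusters[i].append(v)
        # Move each centroid to the mean of its cluster.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

volumes = [12, 15, 11, 14, 980, 1020, 13]       # hypothetical daily volumes
cents = kmeans_1d(volumes)
labels = ["normal" if abs(v - cents[0]) <= abs(v - cents[1])
          else "suspicious" for v in volumes]
print(labels)
```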

A Study on Image Mining Tools Techniques and Framework
Authors:- Assistant Professor C. Rukmani

Abstract:-Image mining is a vital method used to mine knowledge from images, and image segmentation is the primary phase in image mining. Image mining deals with the extraction of unknown information, image data associations and additional patterns that are not explicitly stored in the images. Image data represent a keystone of many research areas, including medicine, forensic criminology, robotics and industrial automation, meteorology and topography, as well as education. Therefore, obtaining specific information from image databases has become of great importance. Images as a special kind of data differ from text data both in their nature and in how they are stored and retrieved. Image mining as a research field is an interdisciplinary area combining the methodologies and knowledge of many branches, including data mining, computer vision, image processing, image retrieval, statistics, pattern recognition, machine learning and artificial intelligence. This paper aims at reviewing the current state of image mining (IM), describing its challenges and identifying directions for future research in the field.

Product Development Process Using Tendering and Bidding
Authors:- Akash Borasi, Prof. Ravi Nagaich

Abstract:- The development process model is very important to integrated product development. In this paper, an integrated product development architecture was established based on a detailed analysis of the relations between the product model and its corresponding process model, where the product data management system was viewed as the information exchange interface and operating platform. An integrated development mode based on the coupling of the product model main line and the process model main line is presented. In this mode, both the product model and the process model are modified synchronously, and this modification is dynamic and mutual. The coupling mechanism between the product model and its corresponding process model was established, and an algorithm was given to evaluate the model coupling effect.

An Enhanced Synchronous Reference Frame (SRF) Based SAF Transmission System
Authors:- M.Tech. Scholar Shivani Patel, Asst. Prof. Gaurav Katare

Abstract:- The advantages of parallel ac-dc power transmission for improving transient and dynamic stability and damping out oscillations have been established. This paper proposes a simultaneous ac-dc power flow scheme through the same transmission line to obtain the advantages of parallel ac-dc transmission: improved stability and damped oscillations, as well as control of the voltage profile of the line by controlling the total reactive power flow. Only the basic idea is proposed, along with a feasibility study using an elementary laboratory model. The main objective is to emphasize the possibility of simultaneous ac-dc transmission with its inherent advantage of power flow control. Control methods based on selective harmonic elimination pulse-width modulation (PWM) techniques with a fuel cell system offer the lowest possible number of switching transitions and improve the voltage level in the SAF transmission system. This feature also results in the lowest possible level of converter switching losses. For this reason, these are very attractive techniques for voltage-source-converter (VSC) based high-voltage dc (HVDC) power transmission systems.

An Implementation of Safety Management Practices and Hazard & Operability Study in a Multinational API Bulk Drugs Company
Authors:- M. Tech. Scholar Abhishek Rawal, Asst. Prof. Vivek Shukla

Abstract:-Hazard and Operability Analysis is a structured and systematic technique for system examination and risk management. In particular, HAZOP is often used as a technique for identifying potential hazards in a system and identifying operability problems likely to lead to nonconforming products. HAZOP is based on a theory that assumes risk events are caused by deviations from design or operating intentions. Identification of such deviations is facilitated by using sets of “guide words” as a systematic list of deviation perspectives. This approach is a unique feature of the HAZOP methodology that helps stimulate the imagination of team members when exploring potential deviations.

Model for the Creation of Electricity within a Piezoelectric Material under Strain
Authors:- Ramasombohitra NivonjyNomenAhy, Rastefano Elisée

Abstract:-This paper introduces the basics of the piezoelectric effect. The reason why electricity is created in a piezoelectric material under an external constraint is discussed in detail. The investigation is based on the basic structure of a molecule that composes piezoelectric crystals.

Evolution of Cryptography Based Sheltered Messaging System
Authors:- Assistant Professor L. Gandhi


Load Balancing Using Time Complexity of Proposed Algorithm on Cloud Computing
Authors:- M.Tech. Scholar Smriti Verma, Asst. Prof. Ashish Tiwari

Abstract:-Cloud computing is a developing field preferred by many users at present, but its popularity depends largely on its performance, which in turn depends on effective scheduling algorithms and load balancing. In this paper we address this issue and propose an algorithm for the private cloud with high throughput, and one for the public cloud that addresses environment awareness together with performance. To improve throughput in the private cloud, Shortest Job First (SJF) is used for scheduling, and to overcome the problem of starvation we use bounded waiting. For load balancing we monitor the load and dispatch each job to the least-loaded VM. To gain profit and leave room for future enhancement in the public cloud, environment consciousness is the key factor, with good performance and load balancing also desired. While load balancing improves performance, environment awareness increases the profit of cloud providers.
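The private-cloud scheme described above (SJF with bounded waiting, then dispatch to the least-loaded VM) can be sketched as follows. The aging formula, job lengths and VM counts are assumptions for illustration, not the paper's exact algorithm:

```python
import heapq

# Jobs are ordered shortest-first, minus an aging credit for time already
# waited (a simple stand-in for bounded waiting, so long jobs are not
# starved); each dispatched job goes to the currently least-loaded VM.
def schedule(jobs, num_vms, age_weight=0.5):
    # Priority = job length - age_weight * time waited (lower runs first).
    queue = [(length - age_weight * wait, length, name)
             for name, length, wait in jobs]
    heapq.heapify(queue)
    vm_load = [0.0] * num_vms
    placement = []
    while queue:
        _, length, name = heapq.heappop(queue)
        vm = vm_load.index(min(vm_load))   # least-loaded VM
        vm_load[vm] += length
        placement.append((name, vm))
    return placement, vm_load

jobs = [("j1", 5, 0), ("j2", 2, 0), ("j3", 9, 12)]  # (name, length, waited)
plan, load = schedule(jobs, num_vms=2)
print(plan, load)
```

Here the long job j3 has waited so long that its aged priority beats j1, illustrating how bounded waiting counteracts SJF starvation.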

Dark Channel and Laplace Based Under Water Image Restoration
Authors:- M.Tech. Scholar Sonam Shrivastav, Asst. Prof. Priya Jha

Abstract:-As the digital world grows day by day, a number of digital image processing issues are covered by different researchers. Among these, this work focuses on underwater noise removal, also known as visibility restoration, which refers to methods that aim to reduce or remove the degradation that occurred while the digital image was being acquired. This paper utilizes Laplace-based distortion detection with the dark channel technique for image restoration. The combination of both techniques helps identify the actual color values present in the original image scene. Experiments were done on many images from different environments and categories. Results show that LEDCR (Laplace Edge Based Dark Channel Restoration) performs better compared to the CBF method in [8].
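The dark channel part of the pipeline can be sketched as follows: for every pixel, take the minimum over the color channels within a local patch; in haze-free regions this value tends toward zero. This is a generic sketch of the dark channel prior, not the paper's LEDCR implementation, and the tiny image below is invented:

```python
# img is a 2-D grid of (r, g, b) tuples; patch is the half-window size.
def dark_channel(img, patch=1):
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = []
            for dy in range(-patch, patch + 1):
                for dx in range(-patch, patch + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        vals.append(min(img[ny][nx]))  # min over channels
            out[y][x] = min(vals)                      # min over the patch
    return out

img = [[(200, 210, 190), (40, 60, 50)],
       [(30, 20, 45),    (220, 230, 215)]]
print(dark_channel(img))
```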

Customer Relation Management an Impact in Tirupur Garment Industries
Authors:- K. Prabha Kumari, S.Tamilvanan

Abstract:-Customers are the most important stakeholders in running a business efficiently in the market. In order to retain customers, CRM acts as a tool for managing and servicing them. CRM is the step-by-step process that manufacturing organizations follow to sustain their customers. In Tirupur, most of the textile manufacturers use Customer Relationship Management as a strategy to retain customers. In this analysis, the researcher examined the impact of CRM and how CRM acts as a tool for an efficient increase in business turnover. The manufacturers use CRM software to enable smooth vendor relationships with internal sales activities, automatic workflow rules, better data organization and enhanced communication with customers. CRM is a significant component of the business as perceived by customer prospects. In this study, the impact of CRM was analysed and a number of suggestions are given to the manufacturers to develop customer relationship management in Tirupur city.

Comparison of High Performance Exclusive-OR Gate Using 130nm FET Technology

Authors:- M.Tech.Scholar Pankaj Kumar Sikdar, Asst. Prof. Rajesh Kumar Paul

Abstract:- An adder is a switch-based digital circuit that performs addition of numbers. In many computers and other kinds of processors, adders are used in the arithmetic logic units. They are also utilized in other parts of the processor, where they are used to calculate addresses, table indices, increment and decrement operators, and similar operations. The full adder/half adder is one of the smallest elements used in complex data processing units to perform fast arithmetic/logical operations, and it is designed from EX-OR/EX-NOR gates. The main aim of this paper is the design of a systematic cell design methodology (SCDM) based three-input EX-OR gate and its comparison with the older one. The intention behind the novel SCDM-based design is to improve propagation delay and reduce energy/power dissipation. Traditional dynamic N-channel FET, dynamic P-channel FET and a new hybrid systematic cell design methodology (SCDM) are designed and compared with the proposed one on the basis of the following parameters: propagation time delay, average dynamic power consumption, energy consumption, and area on SoC. It is observed that the SCDM-based design has the least delay, power dissipation and energy consumption, and occupies the minimum number of transistors.

Static and Materialistic Analysis of Crankshaft
Authors:- PG. Scholar Surabhi Sharma, Dr. P.K. Sharma

Abstract:- The crankshaft is one of the large components with a complex geometry in an internal combustion engine; it converts the reciprocating displacement of the piston into rotary motion. The main reason for failure was determined to be lower surface hardness followed by rapid wear due to contact between the crankpin and bearing surface, which resulted from the absence of oil and improper lubrication. The model of the single cylinder petrol engine crankshaft is created using AutoCAD software. Finite element analysis (FEA) is performed to obtain the variation of stress at critical locations of the crankshaft using the ANSYS software. The crankshaft material EN-19 was changed to ADI (Austempered Ductile Iron) and the properties were compared with the previous material.

A Survey on Outsourced Data Privacy Preserving Techniques Association Rule
Authors:- Phd. Scholar Ravindra Tiwari, Associate Prof. Priti Maheshwary

Abstract:- Protecting users' personal data is a crucial concern for society. The everyday use of the word privacy concerning secure data sharing and analysis is commonly imprecise and can be misleading. To protect the privacy of individuals, many strategies can be applied to the data before or during the mining process. The branch of study that embodies these privacy considerations is referred to as Privacy Preserving Data Mining (PPDM). This paper focuses on the problem of increasing the robustness of the data. Various approaches adopted by researchers are detailed along with their fields of security, and some issues related to the papers are also discussed. Various association rule mining approaches for finding or hiding the hidden information are explained as well.

Optimal Design of Renewable Energy, Water and Sewage Pumping System for a Community: Case Study of New El-Farafra Oasis, Egypt
Authors:- M. Osama abed el-Raouf, Adel. Mallawany, Mahmoud A. Al-Ahmar, Fahmy M El Bendary

Abstract:- The design and evaluation of a stand-alone hybrid renewable energy system for pumping underground water for a newly proposed community in El-Farafra, Egypt are presented. Solar radiation, wind speed and environmental conditions for the proposed location are given. The loads are calculated carefully according to Egyptian codes, and the optimal sizes of the system components are obtained using the HOMER simulation tool based on economic optimization criteria, represented by the net present cost (NPC) and the cost of energy (COE). The compared water pumping systems are PV only, wind turbine only, and PV/wind turbine. The study was illustrated for the climatic conditions of an isolated area in the new El-Farafra oasis, Egypt. The water pumping system is simulated for drinking purposes at the proposed site. The results show that the NPC and COE are lowest in the case of PV only and increase when a wind turbine system is used, due to the low wind speeds at the specified location. The COE was 0.215 $/kWh for PV only and 0.835 $/kWh for wind only.

Enhancing Colour Development of Photochromic Prints on Textile
Authors:- Asst. Prof. D. Anita Rachel, T.S. Thirumalaivasan

Abstract:- Textile UV-radiation sensors have lately been introduced to the field of smart textiles. Inkjet printing has been used as the means of application due to its effective and resource-efficient process. UV-LED radiation curing has been used in combination with inkjet printing in favour of low energy requirements, a solvent-free solution and reduced risk of clogging in the print heads. The problems arising when exposing photochromic prints to UV radiation are that oxygen inhibition during curing and photo-oxidation in the print reduce the print's ability to develop colour. It is the oxygen in the air in combination with UV radiation that gives the photo-oxidizing behavior. The aim of the study is to reduce, with the aid of physical protection, the effect of oxygen inhibition and photo-oxidation in the prints. Three types of physical treatment were used: wax coating, protein-based impregnation and starch-based impregnation. Treatments were applied before curing as well as after curing, and the colour development after activation during 1 min of UV radiation was measured with a spectrophotometer. Multiple activations were also tested to see how the treatments affected the fatigue behaviour of the prints over time. The aim was to have as high a colour development as possible, reflecting reduced oxygen inhibition and photo-oxidation. Results showed significantly higher colour development for samples treated with wax and whey powder before curing, but reduced colour development for amylase impregnation. Over time, whey powder before curing showed the highest colour development due to its highest initial colour development. The lowest fatigue was seen for washed samples containing the chemical stabilizer HALS, showing an increased colour development. In reference to earlier studies, the protective properties of wax and whey powder are due to their oxygen barrier properties protecting the print.
The tested treatments have shown that it is possible to reduce the effect of photo-oxidation during curing, leading to prints giving higher colour development. This gives a great starting point for improving existing and future applications of photochromic prints on textiles.

Design of Cost Aware Secure Routing (CASER) Protocol in Wireless Sensor Network
Authors:- M.Tech.Scholar G. Vamsiviraj, Asst. Prof. A. Uday Kishore

Abstract:- A wireless sensor network consists of spatially distributed autonomous sensors that monitor physical or environmental conditions, such as temperature, noise and pressure, and cooperatively pass their data through the network to a main location. The CASER protocol is used to increase the lifetime of the network. Energy consumption and security are two conflicting issues in WSNs. In this paper, we first propose a novel secure and efficient Cost-Aware Secure Routing (CASER) protocol to address these two conflicting issues through two adjustable parameters: energy balance control (EBC) and probabilistic-based random walking. We then show that energy consumption is in fact disproportional to the uniform energy deployment for a given network topology, which greatly reduces the lifetime of sensor networks. To address this issue, we propose an efficient non-uniform energy deployment strategy to optimize the lifetime and message delivery ratio under the same energy resource and security requirement. We also provide a quantitative security analysis of the proposed routing protocol. For the non-uniform energy deployment, our analysis shows that we can increase both the lifetime and the total number of messages that can be delivered under the same assumptions. We also propose a sleep/ready state scheme to achieve a high message delivery ratio while preventing routing blocking attacks.
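The two routing ingredients named above can be sketched in a simplified next-hop chooser. This is a loose illustration under stated assumptions, not the CASER protocol itself: with probability p the next hop is a random eligible neighbour (probabilistic random walking, which hides the traffic pattern), otherwise the neighbour closest to the sink among those above an energy threshold is chosen (a crude stand-in for energy balance control). The neighbour data and parameters are invented.

```python
import random

def next_hop(neighbours, p_random=0.3, energy_floor=0.2):
    # neighbours: list of (node_id, distance_to_sink, residual_energy).
    # Filter out nearly-drained nodes to balance energy across the network.
    eligible = [n for n in neighbours if n[2] >= energy_floor] or neighbours
    if random.random() < p_random:
        return random.choice(eligible)[0]          # random-walk step
    return min(eligible, key=lambda n: n[1])[0]    # greedy step toward sink

nbrs = [("a", 4.0, 0.9), ("b", 2.5, 0.1), ("c", 3.0, 0.6)]
print(next_hop(nbrs))
```

Node "b" is closest to the sink but nearly drained, so it is excluded and traffic is spread over "a" and "c" instead.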

Socioeconomic Impacts of Tertiary Institution Cultism on Nigerian Youths: History and Overview
Authors:- Agha Romanus Urom, Okpani Otunta ESQ, Farouk Agha Uche, Emmanuel Ogbonnaya Egwu

Abstract:- Cultism is one of the significant vices confronting the Nigerian educational sector today. The activities of these groups have caused the deaths of many. Secret cult groups are widespread in higher institutions in the country and have become a stain on the proper development of Nigerian youth. The basic conditions for a sustained academic culture have been eroded in tertiary institutions, and cultism has had a negative effect on tertiary institutions to which little attention has been given. Students are sometimes attacked or killed with acid, charms, machetes, knives and guns; there are daylight and gang rapes, loss of life and property, and harassment of female students. There is no peace on campus, examination malpractice is rife, and lecturers have been killed. Public relations, as problem-solving communication, is expected to create strategies to inform students and make them aware of what is required of them academically. In order to tackle this problem successfully, this paper recommends that students, the general public and the government take the challenge upon themselves. It focuses on the relevant measures that public relations and any external body could adopt to solve this social phenomenon in tertiary institutions in Nigeria. This paper therefore tries to portray the relationship with, and effect on, the socioeconomic wellbeing of youths in Nigerian tertiary institutions.

A Review on Smart Grid and its Application
Authors:- PG. Scholar Kiruthiga Devi V, Prof. Venkatesan T, Asst.Prof. Sri Vidhya D

Abstract:- Electricity is the most versatile and widely used form of energy. The growing worldwide population is dynamic and will create further increases in electricity demand. Providing a stable and sustainable electricity supply places heavy stress on today's grid. Power grids must be modernized to meet the needs of 21st-century society and economy, which increasingly rely on digital and electronic technologies. Conventional energy sources like coal are also depleting day by day. The smart grid paves the way to deliver the growing demand for power and to manage the increased complexity of power grids. The smart grid integrates modern technologies and renewable energy resources into the future power grid in order to supply more efficient and reliable electric power. This paper gives an overview of the role of the fault current limiter and phasor measurement unit in the smart grid, demand-side management, voltage stability problems in the smart grid, and the application of the smart grid in hybrid vehicles.

Techniques for Fully Integrated Intra/Inter Chip Optical Communication
Authors:- Mrs. K.P. Joshi

Abstract:- Inter/intra-chip optical communication eliminates all the data and control pads generally present in a conventional chip, replacing them with a new type of ultra-compact, low-power optical interconnect. It enables entirely optical through-chip buses that could service hundreds of thinned stacked dies. Even within tight power budgets, very high throughputs and communication density can be achieved. The core of the optical interconnect is a CMOS single-photon avalanche diode (SPAD) operating in pulse position modulation, together with CMOS-compatible optical interconnect techniques based on miniaturized optical channels. By using optical communication one can achieve high throughputs, i.e. several gigabits per second, at very low cost in terms of area and power dissipation, so as to represent a real alternative to conventional systems. In this seminar, inter/intra-chip communication using optical techniques will be discussed.

Highly Confidential Security System Using OTP
Authors:- M.Tech. Scholar Alka Porwal, Asst. Prof. Ajit Saxena

Abstract:- In today's crazy busy lifestyle it is not uncommon for us to be forgetful. We often fail to remember our passwords, mail IDs, PAN card numbers, passport details, study certificate numbers, etc. This kind of data is confidential, yet at present we store it manually (e.g. on mobiles or sticky notes), where it is very easy to lose or even have hacked. The "Highly Confidential Security System" aims at developing a web application through which a user can store his confidential data in a very secure way.
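The abstract does not specify which OTP scheme the system uses, so as one hedged possibility, here is a sketch of the standard HMAC-based one-time password algorithm (HOTP, RFC 4226) that such a system could employ: each login consumes a fresh counter value, so an intercepted code cannot be replayed. The secret below is the RFC's published test key, not a real credential.

```python
import hmac
import hashlib
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    # HMAC-SHA1 over the big-endian 8-byte counter.
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    # Dynamic truncation: pick 4 bytes at an offset given by the last nibble.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(hotp(b"12345678901234567890", 0))  # RFC 4226 test vector → 755224
```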

Image Steganography Using Reversible Texture Synthesis
Authors:- M.Tech.Scholar Sheshank Porwal, Asst. Prof. Ajit Saxena

Abstract:- Steganography is the art of covered or hidden writing. The purpose of steganography is covert communication: to hide the existence of a message from a third party. This paper is intended as a high-level technical introduction to steganography for those unfamiliar with the field. It is directed at forensic computer examiners who need a practical understanding of steganography without delving into the mathematics, although references are provided to some of the ongoing research for the reader who needs or wants additional detail. Although this paper provides a historical context for steganography, the emphasis is on digital applications, focusing on hiding information in online image or audio files. Examples of software tools that employ steganography to hide data inside other files, as well as software to detect such hidden files, are also presented.
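The simplest image-hiding technique in this literature, least-significant-bit (LSB) embedding, can be sketched in a few lines (toy pixel bytes, not the paper's texture-synthesis method): message bits replace the lowest bit of each pixel byte, a change that is visually imperceptible.

```python
def embed(pixels, message: bytes):
    # Flatten the message into bits, most significant bit first.
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    assert len(bits) <= len(pixels), "cover image too small"
    # Clear each pixel's LSB and set it to the next message bit.
    return [(p & ~1) | b for p, b in zip(pixels, bits)] + pixels[len(bits):]

def extract(pixels, length: int):
    # Read back the LSBs and reassemble them into bytes.
    bits = [p & 1 for p in pixels[:length * 8]]
    return bytes(
        sum(bit << (7 - i) for i, bit in enumerate(bits[k:k + 8]))
        for k in range(0, len(bits), 8)
    )

cover = [120, 121, 130, 131, 140, 141, 150, 151] * 2   # toy pixel bytes
stego = embed(cover, b"Hi")
print(extract(stego, 2))  # → b'Hi'
```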

Image Processing Based Optical Character Recognition
Authors:- M.Tech.Scholar Bhawna Singh, Asst. Prof. Ajit Saxena

Abstract:- Handwritten character recognition using template matching is a system that recognizes the characters or alphabets in a given text by comparing two images of the alphabet. The objective of this prototype is to develop a program for an Optical Character Recognition (OCR) system using the template matching algorithm. The scope of the system is as follows: template matching is the algorithm applied to recognize the characters; both capital and small letters and numbers in the Courier New font type are supported; the bitmap image format with a 240 x 240 image size is used; and the alphabet is recognized by comparison against images already stored in our database. The purpose of this prototype is to help blind people who are not able to read, by recognizing characters that would otherwise be difficult to recognize without such techniques; template matching is one solution to this problem.
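The template matching idea can be sketched as follows: the input glyph is compared pixel by pixel with every stored template and the closest match wins. The 3x3 binary "glyphs" below are invented toys; a real system would compare the 240 x 240 bitmaps the abstract describes.

```python
def match_char(glyph, templates):
    # Count of differing pixels between two equal-sized binary images.
    def distance(a, b):
        return sum(pa != pb for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))
    # Return the template label with the smallest pixel distance.
    return min(templates, key=lambda ch: distance(glyph, templates[ch]))

templates = {
    "I": [(0, 1, 0), (0, 1, 0), (0, 1, 0)],
    "L": [(1, 0, 0), (1, 0, 0), (1, 1, 1)],
}
noisy_I = [(0, 1, 0), (1, 1, 0), (0, 1, 0)]  # "I" with one flipped pixel
print(match_char(noisy_I, templates))  # → I
```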

Supervised learning in data mining using Transformation Regression Technique
Authors:- M. Tech. Scholar Karan Bahal

Abstract:- This paper focuses on supervised learning in the data mining and machine learning areas for small data sets. The precision of a data mining regression model is increased by a special transformation technique, which transforms the original regression task into a new regression task equivalent to the original. The transformation was successfully applied to synthetic and real data sets with positive results.
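The paper's specific transformation is not spelled out in the abstract, so the general idea is sketched here with one common instance (an assumption, not the authors' method): log-transforming the target turns an exponential trend into an equivalent linear regression task, which is solved by least squares and then back-transformed. The data points are synthetic.

```python
import math

# Ordinary least squares for y = a + b*x.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.7, 7.4, 20.1, 54.6]                     # roughly e**x
a, b = fit_line(xs, [math.log(y) for y in ys])  # the transformed task
pred = math.exp(a + b * 5.0)                    # back-transformed forecast
print(round(pred, 1))
```

A straight line fitted to the raw values would extrapolate poorly here; fitting in log space recovers the exponential trend (the forecast lands near e**5 ≈ 148.4).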

Impact of Bromocresol Purple in Organic and Inorganic Matrix on Optical Properties
Authors:- Mohd Nasha’ain Nordin, Nik Mohd Aziz Nik Abdul Aziz, Ahmad Aswad Mahaidin

Abstract:- Sol-gel matrices derived from organic and inorganic moieties offer interesting features such as chemical and mechanical stability. A mixture of organic vinyltriethoxysilane (VTES) and inorganic tetraethoxysilane (TEOS) produces sol-gel matrices with improved density, flexibility and optical properties. Bromocresol purple and 18-crown-6-ether are added into the sol-gel mixture with a 75:25 ratio of VTES to TEOS. The effect of the incorporated bromocresol purple and 18-crown-6-ether on the sol-gel matrices is studied using SEM, TGA, FTIR and UV-Vis. It is found that sol-gel matrices incorporating bromocresol purple and 18-crown-6-ether improve response sensitivity and have the prospect of being developed as new sensing materials. Subsequently, the pH-sensitive indicator bromocresol purple (BCP) and a surfactant were incorporated into the sol-gel mixture. The percentages of sodium dodecyl sulfate (SDS) and polyethylene glycol (PEG), which act as surfactants, were varied to observe the effect of improving the host material's nanostructure as well as the interaction between BCP and the sol-gel matrices. The absorption peak of the BCP dye changed significantly in the presence of surfactant compared to the pure VTES:TEOS mixture (control) in the range of 400 to 450 nm.

Distribution of Electric Field in Electro-Osmotic Consolidation of Soil
Authors:- Yaju Sayami

Abstract:- In earlier times, electro-osmosis of soil was used to extract minerals like gold and iron from the soil, but in recent years electro-osmosis has been used for consolidation of soil. Electro-osmosis has been a great boon in the field of geotechnical engineering. Many improvements have been brought to this field, such as replacing electrodes with Electro-Kinetic Geosynthetics (EKG), which has greatly reduced electricity consumption. However, it still has problems: for example, the soil around the electrodes is found to be more consolidated and dry. In this paper, we discuss the different causes of this problem and solutions for eliminating or reducing it by studying the distribution of the electric field in the soil. This is a simulation-based experiment; two models with different dimensions and positioning of the EKG are compared, from which the distribution of the electric field is obtained as results. By analyzing the results, the effective positioning of the EKG is determined.

A Review Paper for Face Emotion Recognition
Authors:- M. Tech. Scholar Bhawana Choubey, Asst.Prof. Shiv Bhatnagar

Abstract:- Biometrics is the study of human behavioural and physical characteristics, and face recognition is one biometric technique. Different methodologies have been used for it; this paper surveys those methods, analysing the various algorithms and strategies. Face recognition is a growing branch of biometric security, since a face is difficult to forge as a security credential. The paper accordingly also explains how a computer can be made to recognise a face.

A Literature Survey on Zigbee
Authors:- M. Tech. Scholar Bhawana Choubey, Asst.Prof. Shiv Bhatnagar

Abstract:- This paper describes ZigBee, a wireless technology notable for extremely low power consumption and low bit-rate wireless PAN operation. ZigBee is designed for wireless automation and other low-data-rate tasks such as smart home automation and remote monitoring. It is a low-cost, low-power wireless mesh networking standard: the low cost allows the technology to be widely deployed in wireless control and monitoring applications, the low power consumption permits longer life with smaller batteries, and mesh networking provides high reliability and larger range. Because of its low cost and low power use, this wireless technology is commonly used in home automation, smart energy, telecommunication applications, personal home, and hospital care. ZigBee enables new opportunities for wireless sensor and control networks: it is standards-based, low cost, usable worldwide, reliable and self-healing, supports a large number of nodes, is easy to deploy, offers long battery life, and is secure.

Analysis of Secant Algorithm for Optimal Path Length of Point-To-Point Microwave Link
Authors:- Ezenugu, A. Isaac, Onwuzuruike, V. Kelechi

Abstract:- In this study, a Secant algorithm was developed for computing the optimal path length of a point-to-point microwave link. The impact of various parameters on the convergence of the algorithm is also presented. A MATLAB program was developed and used to carry out sample numerical computations of the optimal path length and other performance parameters for a given microwave link. The results showed that a convergence cycle of 5 is achieved for a microwave link operating at a frequency of 12 GHz in rain zone N, with a percentage availability of 99.99%. The results also showed that the secant algorithm, as presented in this study, is suitable when rain fading dominates: in that case the convergence cycle is quite stable, maintaining values between 5 and 6 even in the face of variations in frequency, percentage availability and rain rate. The secant method can still be improved to yield a lower convergence cycle when multipath fading dominates. Furthermore, the iterative secant method may not be needed at all when multipath fading dominates, since a closed-form mathematical solution for the optimal path length can be developed in that case.
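The secant iteration the abstract refers to can be sketched in a few lines. The paper's actual fade and path-loss equations are not reproduced here, so the margin function below is a hypothetical stand-in with illustrative ITU-style rain-attenuation coefficients, not the study's values; only the root-finding scheme itself is the technique named in the abstract.

```python
# Secant iteration for the path length d at which the link margin reaches
# zero. All numeric coefficients below are illustrative assumptions.

def rain_attenuation_db(d_km, k=0.03, alpha=1.15, rain_rate=95.0):
    """ITU-style specific attenuation k * R^alpha (dB/km) times path length.
    The coefficients are example values, not the paper's."""
    return k * (rain_rate ** alpha) * d_km

def link_margin_db(d_km, fade_margin=35.0):
    """Hypothetical margin: a fixed fade margin minus rain attenuation."""
    return fade_margin - rain_attenuation_db(d_km)

def secant(f, x0, x1, tol=1e-6, max_iter=50):
    """Classic secant root-finder; returns (root, iterations used)."""
    f0, f1 = f(x0), f(x1)
    for i in range(1, max_iter + 1):
        if f1 == f0:            # flat secant line: cannot proceed
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2, i
        x0, f0, x1, f1 = x1, f1, x2, f(x2)
    return x1, max_iter

optimal_d, cycles = secant(link_margin_db, 1.0, 50.0)
print(f"optimal path length ~ {optimal_d:.2f} km in {cycles} iterations")
```

Because the toy margin is linear in the path length, the secant method converges almost immediately here; with the paper's nonlinear fade models, convergence cycles of 5-6 as reported are plausible.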

Development of Empirical Model for Estimation of the Vertical Profile of Radio Refractivity
Authors:- Onwuzuruike, V. Kelechi, Ezenugu, A. Isaac

Abstract:- Values of meteorological parameters vary with altitude. As such, models used to estimate these parameters need to include altitude as one of the variables. In this paper, an empirical model is developed for estimating the vertical profile of radio refractivity based on available meteorological data and the altitude at which the parameters were obtained. The study was based on Cross River State meteorological data obtained from radiosonde measurements carried out by the Nigerian Meteorological Agency (NIMET). The empirical model estimates radio refractivity as a function of altitude and has a maximum absolute percentage error of 0.0176%. The model was validated with meteorological data for Akure in south-western Nigeria, for which it gave a maximum absolute percentage error of 0.0849%. The empirical model provides a simpler way to determine the vertical profile of radio refractivity from the basic atmospheric parameters, namely temperature, pressure and relative humidity.
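For reference, radio refractivity itself is conventionally computed from temperature, pressure and humidity with the standard ITU-R P.453 expression N = 77.6/T · (P + 4810·e/T). The abstract does not give the fitted altitude-profile coefficients of the paper's empirical model, so the sketch below shows only this underlying calculation; the Magnus saturation-pressure approximation used to obtain the vapour pressure is an assumption of the example.

```python
import math

def refractivity(pressure_hpa, temp_k, vapour_pressure_hpa):
    """Radio refractivity N (N-units) via the standard ITU-R P.453
    expression N = 77.6/T * (P + 4810 * e / T)."""
    return 77.6 / temp_k * (pressure_hpa + 4810.0 * vapour_pressure_hpa / temp_k)

def vapour_pressure(temp_c, rel_humidity_pct):
    """Water vapour pressure e (hPa): Magnus saturation approximation
    scaled by relative humidity."""
    es = 6.1121 * math.exp(17.502 * temp_c / (temp_c + 240.97))
    return es * rel_humidity_pct / 100.0

# Example surface conditions: 1013 hPa, 25 degC, 70 % relative humidity.
e = vapour_pressure(25.0, 70.0)
n = refractivity(1013.0, 25.0 + 273.15, e)
print(f"N ~ {n:.1f} N-units")
```

Typical surface values of N fall in the 250-400 N-unit range, which is the sanity check an empirical altitude model would be fitted against.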

A Survey of Diabetes Prediction Using Machine Learning Techniques
Authors:- M.Tech.Scholar Arvind Aada, Prof. Sakshi Tiwari

Abstract:- Diabetes is one of the main causes of blindness, kidney failure, amputations, heart failure and stroke. When we eat, our body turns food into sugars, or glucose; the pancreas should then release insulin. Insulin serves as a "key" that opens our cells, allowing the glucose to enter so that it can be used for energy. With diabetes, this system does not work: several major things can go wrong, causing the onset of the disease. Type 1 and type 2 diabetes are its most common forms, but there are also other kinds, such as gestational diabetes, which occurs during pregnancy. This paper focuses on recent developments in machine learning which have had a significant impact on the detection and diagnosis of diabetes.
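As a concrete illustration of the kind of machine-learning detection the abstract surveys, here is a minimal k-nearest-neighbour classifier on hypothetical (glucose, BMI) readings. The feature values, labels and the choice of k-NN are all invented for the example, not taken from the surveyed work.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Plain k-NN: majority vote among the k closest training points.
    `train` is a list of ((feature, ...), label) pairs."""
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))
    votes = Counter(label for _, label in nearest[:k])
    return votes.most_common(1)[0][0]

# Hypothetical (glucose mg/dL, BMI) readings labelled
# diabetic (1) / non-diabetic (0).
train = [((85, 22), 0), ((90, 24), 0), ((100, 26), 0),
         ((160, 33), 1), ((170, 35), 1), ((150, 31), 1)]

print(knn_predict(train, (95, 23)))   # query near the non-diabetic cluster
print(knn_predict(train, (165, 34)))  # query near the diabetic cluster
```

Real studies use richer feature sets (insulin, age, pedigree, etc.) and stronger models, but the nearest-neighbour vote is the simplest self-contained instance of the approach.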

Zigbee on Wireless Sensor Network
Authors:- M.Tech. Scholar Amol Ramdas Kasar, Prof. Sakshi Tiwari

Abstract:- A wireless sensor network (WSN) consists of sensors densely deployed to monitor physical or environmental conditions such as temperature, sound, pressure, and so on. The sensor data are transmitted to the network coordinator, which is the heart of the wireless personal area network; in modern scenarios wireless networks contain actuators as well as sensors. ZigBee is a recently developed technology based on the IEEE 802.15.4 standard that can be used in WSNs. Low data rates, low power consumption and low cost are its key features. A ZigBee WSN is composed of a ZigBee coordinator (network coordinator), ZigBee routers and ZigBee end devices. Data from the sensor nodes in the network are sent to the coordinator, which collects the sensor data, stores them in memory, processes them, and routes them to the appropriate node.

Security on Android Devices
Authors:- M. Tech. Scholar Ronak Jain, Prof. Sakshi Tiwari

Abstract:- This paper contains an in-depth description of the security models of current mobile operating systems such as Android, iOS and Windows Phone. These security models are the foundations of security on current platforms; despite their different approaches, they share a great deal in common. The paper also covers the most discussed security issue of recent times, malware, described from an application-based view. Although a modern operating system has strong protection against viruses and other kinds of infection through its security model, the weakest point of mobile devices is still the user, who typically installs additional software onto the device. This paper focuses on Android malware infection and presents several protection techniques against this kind of security threat.

Rumor Detection in Social Networking on Twitter
Authors:- M.Tech. Scholar Garvesh Joshi , Prof. Sakshi Tiwari

Abstract:- Online social media sites like Twitter have become one of the most popular platforms for people to obtain or spread information. However, without any moderation or use of crowdsourcing, there is no guarantee that the shared information is true. This makes online social media highly vulnerable to the spread of rumors. As part of our work, we investigate a dataset on which rumor detection was performed in 2009 and apply machine learning algorithms such as k-nearest neighbors and the naive Bayes classifier to identify tweets spreading rumors. We present the results of our survey analysis and of the extraction of user attributes. An algorithm for pre-processing tweet content is proposed that retains key information to be passed to the learning algorithm, yielding improved rumor detection accuracy.
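The abstract proposes a tweet pre-processing step but does not spell it out. The sketch below shows common cleaning operations (lower-casing, stripping URLs, mentions and punctuation while keeping hashtag words) as one plausible realization, not the paper's exact algorithm.

```python
import re

def preprocess_tweet(text):
    """Minimal tweet-cleaning sketch; each step is a common convention,
    not a step taken from the paper."""
    text = text.lower()
    text = re.sub(r"http\S+", "", text)       # drop URLs
    text = re.sub(r"@\w+", "", text)          # drop user mentions
    text = text.replace("#", "")              # keep hashtag words, drop '#'
    text = re.sub(r"[^a-z0-9\s]", " ", text)  # strip remaining punctuation
    return re.sub(r"\s+", " ", text).strip()  # collapse whitespace

out = preprocess_tweet("BREAKING: @user says it's TRUE!! #rumor http://t.co/xyz")
print(out)
```

The cleaned token stream is what a k-NN or naive Bayes classifier would then consume, typically after bag-of-words vectorization.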

Identification of Vulgar Comments on Social Media Using Data Mining
Authors:- Jash Parekh, Harshad Shewale, Aman Mahajan

Abstract:- Data mining is the process of finding anomalies, patterns and correlations within large data sets to predict outcomes. These outcomes can then be acted on to obtain the desired results. Social media has greatly enriched people's lives, allowing them to share their feelings by posting comments or pictures. Some comments are so vulgar that the person who shared the post usually deletes it. Our approach is to detect such vulgar comments and delete them immediately, as soon as they are posted; even after such a comment is deleted, the user on whose post it was made is told the name of the commenter and the content of the comment. We use the Quicksort algorithm for sorting comments. Through this project we aspire to remove negative comments and thus keep posts clean.

Data mining Approach for High utility Mining as Outlier Detection: A Survey
Authors:- Rashmi Rohitas, Prof. Ruchi Dronawat

Abstract:- Data mining is the process of identifying patterns in data sets by applying appropriate methods from a cluster of machine learning techniques. In recent decades, high utility itemset (HUI) mining has become an emerging research area, which focuses not only on the frequency of itemsets but also on the utility associated with them. Each itemset has a value, such as profit or user interest, called the utility of that itemset. HUIs are itemsets in customer transaction databases which yield a high profit: the target of HUI mining is to discover the itemsets whose utility exceeds a threshold value. Issues faced in HUI mining include dealing with negative item values, the number of database scans, mining in XML databases, candidate set generation, and distributed computing. This paper presents a survey of various algorithms for mining HUIs, their restrictions, and a performance analysis of the surveyed algorithms.
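The utility notion the abstract describes can be made concrete on a toy transaction database: an itemset's utility in a transaction is quantity times unit profit, summed over its items, and a high utility itemset is one whose total utility across all transactions meets a threshold. The exhaustive search below is a naive illustration of the definition, not any of the surveyed algorithms (which exist precisely to avoid this exponential enumeration); the profit and quantity values are made up.

```python
from itertools import combinations

# Toy transaction database (item -> quantity) and a unit-profit table.
profit = {"A": 5, "B": 2, "C": 1}
transactions = [
    {"A": 1, "B": 4},
    {"A": 2, "C": 6},
    {"B": 2, "C": 3},
]

def utility(itemset, tx):
    """Utility of an itemset in one transaction: zero unless every item
    occurs, else sum of quantity * unit profit over its items."""
    if not all(i in tx for i in itemset):
        return 0
    return sum(profit[i] * tx[i] for i in itemset)

def high_utility_itemsets(min_util):
    """Naive exhaustive HUI search (exponential; fine for a toy example)."""
    items = sorted(profit)
    result = {}
    for r in range(1, len(items) + 1):
        for combo in combinations(items, r):
            total = sum(utility(combo, tx) for tx in transactions)
            if total >= min_util:
                result[combo] = total
    return result

print(high_utility_itemsets(min_util=15))
```

Note that utility, unlike support, is not anti-monotone: a superset can have higher utility than its subsets, which is why HUI algorithms need upper bounds such as transaction-weighted utility rather than the Apriori pruning rule.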

A Novel Scheduling Algorithm for Wireless Optical Network
Authors:- Omprakash Choudhary, Alok Shukla

Abstract:- With the increase in traffic carried by wireless private networks and the continuous development of new network technologies, the problems of the traditional wireless private network model are becoming more and more obvious. Here, optical wireless channels are scheduled by observing packet requirements and the available channels. In this work two well-known scheduling algorithms, First Come First Serve (FCFS) and Shortest Data First (SDF), are compared: scheduling of the optical wireless channels obtained from WDM is done in the same packet-delivery environment for both. Experiments and results show that the proposed optical wireless channel scheduling with SDF outperforms the FCFS algorithm on various evaluation parameters.
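The FCFS-versus-SDF comparison can be reproduced on a toy single-channel model: serving shorter packets first reduces average waiting time, which is the classic shortest-job-first result. The packet sizes and the unit service rate below are invented for illustration, and all packets are assumed available at time zero, which is a simplification of the paper's setup.

```python
def average_wait(packets):
    """Average waiting time when packets are served in the given order on a
    single channel (service time = packet size / rate, with rate = 1)."""
    wait, clock = 0.0, 0.0
    for size in packets:
        wait += clock      # this packet waited until the channel freed up
        clock += size      # channel busy while it transmits
    return wait / len(packets)

arrivals = [9, 2, 7, 1, 5]            # packet sizes in arrival order
fcfs = average_wait(arrivals)         # First Come First Serve: arrival order
sdf = average_wait(sorted(arrivals))  # Shortest Data First: smallest first

print(f"FCFS avg wait = {fcfs}, SDF avg wait = {sdf}")
```

On this input FCFS averages 11.4 time units of waiting versus 5.4 for SDF; the trade-off, not modelled here, is that SDF can starve large packets under sustained load.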

Innovation Technology for Detection of Tangible & Intangible Failure Modes Through Condition Base Monitoring System
Authors:- Research Scholar T D Sundaranath, Registrar Professor Dr. G. R. Selokar (Supervision)

Abstract:- Primary signals are generally those signals or parameters which are required to assess the performance of equipment and which the equipment is designed to emanate, such as oscillations in vibratory chutes/screens. Monitoring of primary signals is termed "performance monitoring" or "performance trend monitoring". All other signals, which appear as loss output, such as vibration, sound, thermal, chemical or physical changes, are termed secondary signals; equipment is normally not designed to produce them. Monitoring primary signals alone does not allow efficient assessment of the health and condition of equipment and machines. As secondary signals are generally a result or form of loss output, monitoring them becomes inevitable, and often more necessary, for equipment health monitoring and technical diagnostics. It is again emphasized that the selection of condition monitoring parameters, monitoring and sampling points, monitoring frequencies and techniques, and the analysis of the monitored parameters and signals must be done timely and efficiently to enable maintenance personnel to take timely action.

DOI: 10.61137/ijsret.vol.4.issue6.427

Artificial Intelligence in Practice: Legal and Ethical Challenges in its Deployment across Sectors

Authors: Research Scholar Aman Malik

Abstract: The rapid deployment of Artificial Intelligence (AI) across diverse sectors—including healthcare, transportation, finance, and governance—has prompted pressing legal and ethical concerns, especially in technologically emerging economies like India. This paper critically examines the legal and ethical challenges surrounding the integration of AI systems in practical applications, with a focus on the Indian regulatory landscape. While AI promises efficiency and innovation, it also raises fundamental questions of accountability, privacy, bias, and transparency. Key issues such as the attribution of liability for autonomous decisions, the ethical implications of algorithmic discrimination, and the lack of a clear legal framework for AI-generated data and actions are discussed. The paper further explores the limitations of existing Indian laws, including the Information Technology Act, 2000 and the absence of a dedicated AI or data protection statute (pre-GDPR adaptation). Drawing on global standards and domestic case studies, this study proposes a need for robust regulatory mechanisms, ethical design protocols, and sector-specific governance to ensure responsible AI deployment. The findings aim to contribute to the evolving discourse on AI governance and serve as a foundational reference for future legal reforms in India.


Legal Implications of Data Breaches and Cybersecurity Failures

Authors: Research Scholar Aman Malik

Abstract: In an increasingly digitized world, data breaches and cybersecurity failures have emerged as significant legal and regulatory concerns for both public and private sector entities. This paper explores the legal implications associated with data security incidents, focusing on regulatory frameworks, liability issues, and enforcement actions in place prior to December 2018. Key legislation such as the European Union’s General Data Protection Regulation (GDPR), the United States’ sector-specific laws (including HIPAA and the GLBA), and emerging legal standards in Asia are examined. The study analyzes landmark data breach cases to highlight the evolving role of compliance, corporate responsibility, and the consequences of negligence in cybersecurity governance. It also discusses the legal challenges organizations face in cross-border data breaches and the implications for international cooperation. By assessing these issues through a legal and technological lens, the paper provides guidance on risk mitigation, legal preparedness, and the necessity for robust cybersecurity policies to meet growing regulatory expectations.


Empowering Developer & Operations Self-Service: Oracle APEX + ORDS As An Enterprise Platform For Productivity And Agility

Authors: Shravan Kumar Reddy Padur

Abstract: Enterprise IT teams increasingly seek platforms that allow developers and operations staff to deliver solutions without lengthy provisioning cycles, as delays in provisioning can undermine agility, competitiveness, and customer satisfaction. Oracle Application Express (APEX), when paired with Oracle REST Data Services (ORDS), provides a powerful low-code environment that accelerates the creation of self-service portals while ensuring secure and governed access to enterprise data. By 2018, the platform had advanced significantly, introducing REST-enabled SQL for seamless data exchange, Oracle JET-based visualization for modern, responsive interfaces, and tighter integration with cloud-native deployment models—shifting its role from a lightweight departmental tool into a strategic enterprise platform. These capabilities enabled organizations to foster a culture of innovation, empower non-specialist developers, and reduce the workload on centralized IT teams. This article examines how APEX self-service applications contribute to enhanced developer productivity, operational efficiency, and alignment with contemporary DevOps and SRE practices, with supporting architectural diagrams highlighting the progression from traditional production topologies to secure, cloud-based deployments.

DOI: https://doi.org/10.5281/zenodo.17292089


The influence of digital twins on predictive infrastructure management

Authors: Manoj Patil

Abstract: Digital twins are revolutionizing the realm of predictive infrastructure management by offering enhanced capabilities for monitoring, analysis, and maintenance planning in complex infrastructure systems. By creating virtual replicas of physical assets, digital twins enable real-time data integration, simulation, and predictive analytics, facilitating timely decision-making for infrastructure performance optimization and risk mitigation. The adoption of digital twin technology addresses many challenges faced by traditional infrastructure management, including aging assets, dynamic environmental influences, and the need for sustainable operations. This article explores the multifaceted impact of digital twins on predictive infrastructure management, highlighting their role in predictive maintenance, asset lifecycle management, risk assessment, and system optimization. The integration of advanced sensor networks, Internet of Things (IoT) devices, and artificial intelligence (AI) with digital twins further enhances their predictive power, enabling proactive responses to emerging infrastructure issues. The article also discusses the implementation challenges, data security concerns, and the future outlook of digital twin technology in infrastructure sectors such as utilities, transportation, and smart cities. Through a comprehensive examination of current trends, use cases, and technological advancements, this article provides a detailed understanding of how digital twins are shaping the evolution of infrastructure management practices, ultimately contributing to enhanced resilience, cost efficiency, and sustainability.

DOI: https://doi.org/10.5281/zenodo.17776010

The influence of hybrid storage systems on large-scale data analytics performance

Authors: Priyanka Sharma

Abstract: Hybrid storage systems have increasingly become a pivotal architecture in the realm of large-scale data analytics, addressing the ever-growing demand for managing vast volumes of diverse data with speed and efficiency. By integrating multiple types of storage media, typically solid-state drives (SSDs) and hard disk drives (HDDs), hybrid storage optimizes data accessibility and throughput by leveraging the performance benefits of faster storage technologies alongside the cost-effectiveness and capacity of traditional drives. This synergy is particularly crucial in large-scale data analytics, where substantial datasets must be rapidly processed to derive actionable insights, impacting industries such as finance, healthcare, telecommunications, and scientific research. The influence of hybrid storage systems transcends mere data warehousing, affecting the efficiency of data retrieval, latency, system throughput, and computing cost. These systems support the flexible caching of hot data in faster tiers, while colder, less frequently accessed data remains in slower storage, thereby creating a dynamic environment that can adapt to workload variations. Furthermore, the architecture of hybrid systems is conducive to scalability and fault tolerance, essential features when dealing with petabyte-scale analytics clusters and distributed frameworks like Apache Hadoop and Spark. This article explores the architecture of hybrid storage systems, the performance implications they bear on large-scale data analytics, and the cost-performance balance they offer. Additionally, it examines case studies demonstrating improvements in real-world analytics applications, the challenges in managing hybrid storage environments, and future trends in storage technologies impacting analytics performance. By understanding these aspects, enterprises can better architect their storage infrastructure to meet the demanding requirements of data-intensive analytics workloads.

DOI: https://doi.org/10.5281/zenodo.17776151
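The hot/cold tiering described in the hybrid-storage entry above can be sketched as a tiny promotion policy: blocks accessed frequently live on the fast tier, the rest on the capacity tier. The class name, the access-count threshold, and the frequency-based policy itself are illustrative assumptions; production systems use recency- and cost-aware policies rather than a bare counter.

```python
from collections import Counter

class HybridStore:
    """Toy hot/cold tiering sketch: a block accessed more than `threshold`
    times is promoted to the fast (SSD) tier; otherwise it stays on HDD."""

    def __init__(self, threshold=3):
        self.threshold = threshold
        self.hits = Counter()      # per-block access counts

    def access(self, block):
        self.hits[block] += 1
        return self.tier(block)

    def tier(self, block):
        return "ssd" if self.hits[block] > self.threshold else "hdd"

store = HybridStore()
for _ in range(5):
    store.access("index-page")     # hot: read repeatedly, gets promoted
store.access("cold-archive")       # cold: read once, stays on capacity tier

print(store.tier("index-page"), store.tier("cold-archive"))
```

The same counter-and-threshold idea, with decay added, is the skeleton behind the "flexible caching of hot data in faster tiers" the abstract refers to.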

The Impact Of Explainable AI On Improving Transparency In Security Decision Systems

Authors: Tenzin Dorji

Abstract: The rapid integration of Artificial Intelligence (AI) into cybersecurity has significantly enhanced threat detection, intrusion prevention, and decision-making capabilities. However, as AI models become increasingly complex, their decision processes often operate as “black boxes,” making it difficult for human analysts to understand, verify, or trust their outcomes. This lack of interpretability poses critical challenges to transparency, accountability, and ethical governance in security decision systems. In recent years, Explainable Artificial Intelligence (XAI) has emerged as a transformative approach to bridge this gap by making AI systems more interpretable and transparent without substantially compromising performance. XAI seeks to ensure that every automated security decision, whether related to intrusion detection, access control, or malware classification, is supported by understandable and justifiable reasoning. The concept of explainability in AI-based security systems is grounded in the need for trustworthy AI, where users, auditors, and stakeholders can comprehend how and why a system made a particular decision. This is particularly crucial in security domains where decisions have direct implications for privacy, compliance, and risk mitigation. For instance, when an intrusion detection system flags anomalous network behavior, it is not sufficient to merely report the event; analysts must also understand which features or patterns triggered the alert. XAI methods such as Local Interpretable Model-agnostic Explanations (LIME), SHapley Additive exPlanations (SHAP), and attention-based visualization frameworks provide the interpretive mechanisms required for this understanding. These tools offer insights into the model’s internal logic, allowing for greater collaboration between AI systems and human security experts.
This review paper explores the theoretical foundations, technical methodologies, and practical implications of XAI in enhancing transparency across diverse security decision systems.

DOI: http://doi.org/10.5281/zenodo.17839883
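LIME and SHAP internals are beyond a short sketch, but the model-agnostic idea they share can be illustrated with the simpler permutation-importance technique mentioned nowhere in the XAI entry above and used here purely as a stand-in: shuffle one input feature and measure how much the model's accuracy drops. The "black-box" alert scorer, its two features, and the data are all invented for the example.

```python
import random

def model(x):
    """Stand-in 'black box' alert scorer: flags when failed logins are high;
    packet size is deliberately ignored. Purely illustrative."""
    failed_logins, packet_size = x
    return 1 if failed_logins > 10 else 0

def permutation_importance(predict, data, labels, feature, trials=20, seed=7):
    """Model-agnostic importance: mean accuracy drop after shuffling the
    values of one feature column across rows."""
    rng = random.Random(seed)

    def acc(rows):
        return sum(predict(x) == y for x, y in zip(rows, labels)) / len(labels)

    base = acc(data)
    drops = []
    for _ in range(trials):
        col = [row[feature] for row in data]
        rng.shuffle(col)
        shuffled = [tuple(col[j] if i == feature else v
                          for i, v in enumerate(row))
                    for j, row in enumerate(data)]
        drops.append(base - acc(shuffled))
    return sum(drops) / trials

# Rows are (failed_logins, packet_size); labels match the model by construction.
data = [(2, 500), (25, 60), (1, 900), (40, 70), (3, 300), (18, 120)]
labels = [model(x) for x in data]

imp_logins = permutation_importance(model, data, labels, 0)
imp_packet = permutation_importance(model, data, labels, 1)
print("failed_logins importance:", imp_logins)
print("packet_size importance:  ", imp_packet)
```

Shuffling the ignored feature leaves accuracy untouched (importance exactly zero), while shuffling the decisive feature degrades it; an analyst reading these numbers learns which signal actually triggered the alerts, which is the transparency goal the abstract describes.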

The Influence Of Digital Twin Simulations On Optimizing Enterprise Cloud Infrastructure

Authors: Priya S. Bhatia

Abstract: Digital twin (DT) technology, built on virtual replicas of physical assets, systems, or processes, has emerged as a transformative approach to optimizing enterprise cloud infrastructure. By enabling real-time data integration, predictive modeling, and adaptive simulations, digital twins provide organizations with unprecedented visibility and control over cloud environments. This review explores how digital twin simulations influence the performance, scalability, cost-efficiency, and resilience of enterprise cloud systems. It examines the underlying principles, enabling technologies, and industrial implementations that drive this integration. Through a synthesis of recent research and case studies, the paper identifies how DTs enhance decision-making by forecasting workloads, detecting failures before they occur, and recommending optimal configurations for resource utilization. Additionally, the review highlights how digital twins enable dynamic cloud scaling, energy optimization, and service reliability through continuous feedback loops and machine learning-driven simulations. However, challenges remain, including interoperability issues, high computational costs, and cybersecurity risks. The paper also presents emerging trends such as edge-cloud digital twins, AI-driven automation, and sustainable infrastructure optimization. The findings underscore that digital twin simulations are not just tools for operational efficiency but strategic enablers of intelligent, autonomous cloud ecosystems. Ultimately, the review provides a foundation for future research into standardization, security, and advanced analytics in twin-driven cloud infrastructures.

DOI: http://doi.org/10.5281/zenodo.17839944

Operationalizing Regulatory Governance Through Enterprise Master Data Design: A Practical Examination of OFAC, KYC, and GDPR Controls

Authors: Nagender Yamsani

Abstract: This study examines how enterprise master data design can be operationalized as a primary mechanism for regulatory governance within highly regulated financial environments. The research addresses a persistent industry challenge where regulatory obligations such as OFAC screening, customer due diligence, and personal data protection are often implemented as isolated compliance processes rather than embedded into core data architectures. The purpose of this work is to demonstrate how governance-first master data management can translate regulatory intent into enforceable, auditable, and scalable enterprise controls. Using a qualitative case-based methodology grounded in architectural analysis, control mapping, and operating model assessment, the study evaluates how regulatory requirements are structurally realized through master data domains, stewardship workflows, validation checkpoints, and exception handling mechanisms. The findings show that treating master data as a governed control layer enables consistent regulatory enforcement across operational systems, reduces manual remediation cycles, and strengthens audit readiness. The study further highlights how clear ownership models, policy-driven data validation, and controlled synchronization patterns contribute to sustained compliance without constraining business operations. From an academic perspective, the research extends governance and information systems literature by positioning master data architecture as a regulatory execution instrument rather than a purely technical capability. From an industry standpoint, the study provides practical guidance for financial institutions seeking to embed compliance obligations directly into enterprise data foundations, reinforcing trust, transparency, and operational resilience.

DOI: http://doi.org/10.5281/zenodo.19019592

Professional Development Priorities Among Different Age-Based Groups Of Higher Education Faculty In Institutes Of Delhi

Authors: Dr. Suman Dhawan

Abstract: Faculty Development Programs (FDPs) are very important to the careers of teachers in higher education institutions. They become better at what they do, and the entire institution is improved. But the fact is that faculty members are not all alike, and age is actually a factor in what they want from an FDP. This research explores how the interests of faculty change with age. On the basis of a structured survey of 302 faculty members from various universities and colleges, a One-Way ANOVA test was conducted to determine how needs differ in three age groups: 25-34, 35-44, and 45 and above. The findings are quite striking. Age does make a difference. The younger generation is more concerned with handling classes and establishing a sound foundation in subject matter. The middle-aged faculty begin to tilt towards competency development and professional growth. The 45+ age group is more concerned with developing their personalities and management acumen. To synthesise all this, this study proposes an Age-Life-Cycle Model of Faculty Development Priorities. The study concludes that "one-size-fits-all" solutions do not work. If universities are serious about faculty development, they need to listen to where people are in their life cycle and provide development that fits.

DOI: http://doi.org/10.5281/zenodo.18584290

Next-Generation Satellite Link Budget Analysis for Transcontinental Communications

Authors: Pratikbhai Patel

Abstract: This research paper proposes an elaborate link budget analysis framework for next-generation Low Earth Orbit (LEO) satellite constellations to facilitate seamless transcontinental communications. The paper examines the technical requirements for supporting high-availability broadband coverage on Earth under dynamic orbital and atmospheric conditions. It combines free-space path loss models, rain fade and atmospheric attenuation models, orbital mechanics, adaptive modulation, optical inter-satellite link integration, and interference resilience in an International Telecommunication Union (ITU)-compatible framework. The results indicate that dynamic environmental modeling, adaptive transmission methods, and propulsion-enhanced orbital stability are important for maintaining consistent link margins across geographically dispersed areas. The study also highlights the need to incorporate climate-sensitive attenuation forecasting, spectrum agility, and security-oriented interference mitigation in order to make the system more robust. The paper concludes that next-generation LEO constellations can scale to low-latency, resilient transcontinental connectivity when backed by an integrated and dynamic link budget design methodology. This framework offers a technically rigorous basis for future satellite communication systems worldwide.

DOI: https://doi.org/10.5281/zenodo.19093210
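At the core of any such link budget is the free-space path loss (FSPL) term. A minimal sketch with purely illustrative parameter values (the paper's actual constellation geometry and margins are not reproduced here):

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: FSPL = 20*log10(4*pi*d*f / c)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20.0 * math.log10(4.0 * math.pi * distance_m * freq_hz / c)

def link_margin_db(eirp_dbw, rx_gain_dbi, path_loss_db, rain_fade_db, required_dbw):
    """Margin left over once losses are subtracted from the received power."""
    return eirp_dbw + rx_gain_dbi - path_loss_db - rain_fade_db - required_dbw

# Hypothetical Ka-band LEO downlink: 1000 km slant range at 20 GHz
loss = fspl_db(1_000_000, 20e9)   # roughly 178.5 dB
```

Adaptive modulation and climate-sensitive rain-fade forecasting, as the abstract argues, effectively turn `rain_fade_db` and the required threshold into time-varying inputs to this same calculation.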

Economic Determinants Of Skilled Labor Migration: A Cross-National Analysis Of Business Economics And International Professional Mobility

Authors: Dipikaben Solanki

Abstract: Skilled labor migration has proven to be one of the world's major economic issues in terms of workforce distribution, productivity growth, and the stability of labor markets across countries. Growing globalization has increased the mobility of professionals, with developing economies losing their talent while advanced economies experience mounting labor shortages. This paper discusses the economic factors that drive skilled migrants, with emphasis on the wage differentials, unemployment rates, and skill-level factors that determine international migration choices. A quantitative cross-national secondary-data approach is used, drawing comparative evidence from major labour-exporting and labour-importing economies within the G20 framework. The analysis indicates that rational economic assessment plays a significant role in migration decisions, with skilled professionals migrating to maximize income potential, employment security, and long-term productivity gains. The results show that economic instability in the country of origin greatly increases the propensity to migrate, whereas technological progress and structured labor demand draw highly skilled workers into developed economies. The paper also indicates that unmanaged migration leads to workforce imbalances which further strengthen inequalities in global labor markets. Policy implications point to the significance of bilateral labour agreements, domestic human capital investments, and controlled migration regimes that facilitate sustainable circulation of skills instead of permanent loss of talent. The study offers a comprehensive economic and policy framework promoting balanced workforce mobility that will assist G20 countries in harmonizing migration governance with long-term economic competitiveness and labor market sustainability.

DOI: https://doi.org/10.5281/zenodo.19250248

Published by:

Toward Self-Optimizing Enterprise Data Pipelines: AI-Assisted Performance Tuning for PL/SQL and Informatica Workflows

Uncategorized

Authors: Srujana Parepalli

Abstract: Performance optimization of enterprise data pipelines has traditionally relied on rule-based heuristics, manual tuning cycles, and the accumulated intuition of experienced practitioners; however, as data volumes scale into terabytes and petabytes, workloads become increasingly heterogeneous, and execution environments span databases, ETL engines, and distributed infrastructure, these approaches struggle to deliver consistent and timely results. This paper presents an AI-assisted performance tuning framework for PL/SQL execution environments and Informatica PowerCenter workflows that augments established database performance metrics such as execution plans, wait events, resource utilization, and ETL session statistics with machine-learning-driven optimization techniques capable of learning from historical workload behavior. Building on foundational research in automatic database tuning, self-managing and autonomic systems, and ETL performance engineering, the proposed architecture continuously correlates workload characteristics, configuration parameters, and observed performance outcomes to generate data-driven recommendations for optimal SQL execution strategies, memory and session configurations, partitioning schemes, and workflow design patterns. By synthesizing academic research and industry practices published between 2000 and 2017, the study illustrates how AI-based optimization complements traditional tuning methods by reducing manual intervention, improving adaptability to changing data patterns, and delivering measurable improvements in throughput, latency, and operational stability across large-scale enterprise data platforms.


Wire Transfer / Telegraphic Transfer Details


Dear Author: Kindly use the transaction details below for the “Wire Transfer / Telegraphic Transfer”. The transfer amount is mentioned in your acceptance mail.

Correspondent Bank Details
  • Bank Name: YES Bank
  • SWIFT: YESBINBB
  • Account Number: 011961900001439
  • IFSC Code: YESB0000119
  • Branch Address: MP NAGAR Zone 1 BHOPAL M.P. India
Further Credit To
  • SWIFT/ BIC Code: YESBINBB
  • Beneficiary Name: Deng Infotech Solutions Pvt LTD
  • Contact Person Name: Deepak Patel
  • Beneficiary Account No: 011961900001439
  • Purpose: Paper / Article Publication Charges

Call For Paper International Journal


International Journal IJSRET invites researchers in the Science and Engineering fields to submit their latest review papers, research papers, study papers, letters, or implementation papers. The Editor has issued this call for papers for international journal publication under the following points:

  • The paper or article should have an innovative title with content from the latest research areas.
  • The paper must have good content with a deep understanding of the topic.
  • The paper should not be heavily plagiarized.
  • Proper references should be used in the paper.
  • Papers written in the English language are invited.

Submit Your Paper  / Check Publication Charges

Paper publication queries for the Call for Paper International Journal

A number of questions present in authors' minds are listed below with their answers.

Q1. Validity of the international journal?

Ans: Type the journal's 8-digit ISSN ‘2395-566X’ on www.portal.issn.org to check its validity and year of first publication.

Q2. Impact factor of the journal?

Ans: The impact factor is 3.24.

Q3. Review time after submission?

Ans: Normally it takes 3 to 4 days.

Q4. Journal processing fees?

Ans: Check the journal processing fees: https://ijsret.com/article-processing-fee/

Q5. Journal paper submission?

Ans: Submit your paper through: https://ijsret.com/paper-submission/

Q6. Does the journal increase the processing fee with the number of authors?

Ans: No, the journal does not increase the processing fee; it is constant.

Q7. Does the journal increase the fee with the number of pages?

Ans: No, the journal does not increase the processing fee; it is constant.

Best Journals To Publish Research Papers


To establish the validity of research in any field, researchers publish papers in journals. But this leads to another confusion: finding the best journals to publish research papers in. Scholars who have just started their research careers should check the following points of a journal to judge whether it is good or not.

  • Journal should have an ISSN number (an 8-digit ID)
  • Journal should have a high impact factor obtained from various resources
  • Journal should be at least five years old
  • Journal should have a proper submission form with an email ID
  • Journal should have a reviewer list
  • Journal should have a valid payment gateway (asking for payment to a personal account is a bad sign)
  • Communication should be responsive, by email or chat
  • Journal should have a copyright form and paper format

Submit Your Paper  / Check Publication Charges

So how can one come to know about this?

Open www.portal.issn.org and type the journal's 8-digit ISSN; you can get its validity and year of first publication.

Beware of journals that show improper indexing through images or merely list the names of good indexes. The best journals to publish research papers in have the following points:

1. Regular issues and volumes.

2. Journal issues should contain papers as well, at least 3 or more.

3. Published papers should come from authors of more than one country.

4. Publication fees should be made clear to the author if the journal is paid.


IJSRET Volume 4 Issue 5, Sep-Oct-2018


Development of Secure Image Transposal Algorithm Using 16×16 Quantization Table
Authors: Vijay Bhandari, Dr. Sitendra Tamrakar, Dr. Piyush Shukla, Arpana Bhandari

Abstract:– Digital image scrambling can turn an image into a completely different, meaningless image through transformation; it is a preprocessing step in hiding information in digital images, also known as information disguise. Image scrambling technology builds on information hiding technology, which permits non-password protection for hidden data. Information hiding technology led to a revolution in network information warfare by bringing a series of new combat techniques, and many countries pay considerable attention to this area. Network information warfare is an important part of information warfare, and its core idea is to use public networks for private information transfer. An image produced by scrambling encryption algorithms is chaotic, so an attacker cannot decipher it. Some improved digital watermarking techniques apply scrambling to change the distribution of error bits in the image and thereby improve the robustness of the watermark. The Arnold scrambling algorithm has the features of simplicity and periodicity, so it is used widely in digital watermarking technology.
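The Arnold scrambling named in the abstract is the Arnold cat map, the mapping (x, y) → ((x + y) mod N, (x + 2y) mod N) on an N×N image. A minimal sketch on a toy grid (not the paper's 16×16 quantization scheme):

```python
def arnold_scramble(img):
    """One Arnold cat map iteration on an N x N image (list of lists).
    The transform matrix [[1, 1], [1, 2]] has determinant 1, so the map is
    a bijection on the pixel grid and, being periodic, eventually restores
    the original image when applied repeatedly."""
    n = len(img)
    out = [[0] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            out[(x + y) % n][(x + 2 * y) % n] = img[x][y]
    return out
```

The periodicity is exactly the property the abstract highlights: scrambling before embedding disperses error bits, and iterating the map through a full period recovers the original image.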

Enhancement of Thermal Plant Efficiency by Using Double Pipe Heat Exchanger
Authors: M. Tech. Scholar Indrajit Patidar, Asst. Prof. Nilesh Sharma

Abstract:– The heat exchanger section of a thermal power plant uses water as a coolant, which is incapable of extracting the low-grade heat input in the heat exchanger section. This results in the loss of a major chunk of heat energy from the thermal power plant. However, with a proper coolant in the heat exchanger section, it is possible to utilize the heat available there by introducing a secondary cycle for energy generation along with the conventional primary cycle. Isobutane, with its appropriate physical properties, proves to be a good coolant in the secondary cycle for extracting energy from the low-grade heat available in the heat exchanger section. The efficiency of a conventional thermal power plant can be improved from 80% to around 87–87.65% by incorporating this change. Although the paper details the improvement of efficiency in a thermal power plant, the same methodology can be used in any steam-operated power plant, such as nuclear, geothermal, or solar thermal electric power plants. The improvement in efficiency leads to a lesser burden on non-renewable resources, such as coal and nuclear fuel, and also lowers pollution effects on the environment. Proper practical implementation of the proposed model has the potential to revolutionize the energy generation paradigm of the world.

A Study of Customer Satisfaction with Business-To-Business Customers Model
Authors: Shashanka. B. K

Abstract:– Most profit-seeking organizations are ultimately driven by money, and retail stores are no different. In retail markets the customers are the source of profit, and companies trying to acquire them are constantly developing new strategies in order to gain the competitive edge that draws customers into their stores. When a company offers products and services that satisfy the requirements of the customer and produce additional value, satisfaction, and perceived quality, the company has the best prospects for success. High customer satisfaction is one competitive edge a company can have, and its association with profitability is one reason it is such a critical part of a company's strategy and has seen so many researches and studies over time. The customers of the case company X consist of ordinary household buyers and business-to-business customers. X conducts periodic studies that measure the satisfaction of the ordinary household buyers, but no research has yet been done to measure the satisfaction of the business customers. This research will therefore give new information to X, and the results of this study can be valuable to the company for achieving better outcomes. According to this study, the B2B customers are somewhat satisfied with the services and products of X, and the customers' expectations were also met well. The most significant factor for the B2B customers visiting X was found to be the location of the company, which had a decisive effect on the costs for the customers.

Comparison Study of Flash & Fire Point of Bio-diesel Produced by Mustard and Soya Bean Oil
Authors: Asst. Prof. Mahesh Chand Saini, Mayank Sharma, Manoj Bhandari, Md. Nazwazish

Abstract:– Consumption of fossil fuels is increasing day by day, which results in a decrease in the amount of fossil fuel present on earth. In 30-40 years these sources are in danger of extinction, so to meet human requirements there is a need for alternative fuel sources. One such alternative fuel is derived from vegetable oils and fatty acids, and it is a key alternative fuel source. Biodiesel production is completed by the trans-esterification process, a chemical reaction between oil and alcohol in the presence of a catalyst which results in the separation of biodiesel and glycerin. The process takes 2-4 hours, depending on the type of vegetable oil, and the properties of the biodiesel also depend on the oil used. This paper illustrates the comparison between soybean and mustard oil biodiesel on parameters such as flash and fire point. Various blend ratios of biodiesel (5%, 10%, 15% and 20%) are taken, and both edible and non-edible oils are used for producing the biodiesel.

A Review of Methods of Analysis and Mitigation of Landslide
Authors: Emmanuel Arinze, Paul Chibundu Enyinnia, Aju Daniel Ekan, Anthony Chibuzo Ekeleme

Abstract:– Cities around the world located in hilly and mountainous areas always face the challenge of landslides, which can be disastrous and result in loss of life and property when they happen; large expenditures are incurred by governments on the investigation, design, and implementation of mitigation and preventive measures to reduce the likelihood of loss of life and economic losses due to landslides. Landslide analysis can be based on hazard or susceptibility. The former is based on evaluating the probability of a landslide failure within a specified period of time and within a given region, while the latter is based on classifying the region into several successive classes with different landslide potentials. Based on the foregoing, this paper focuses on a review of landslide hazard analysis and mitigation. There are four approaches to modern landslide hazard analysis: the inventory approach, the heuristic approach, the statistical approach, and the deterministic approach. Mitigation of a landslide depends on its type; each type of landslide has its own mitigation method, which is discussed in this paper type by type. The four approaches can be applied to regional landslide hazard mapping, and all have their shortcomings. Inventory analysis requires many maps that need interpretation; heuristic analysis requires long-period landslide data, which may suffer from short historical records, incomplete inventories, and a possible mixing of extreme events; the statistical approach looks better compared with the weaknesses of the other approaches, but further development is still needed to predict landslides.

Reversible Image Data Embedding using Modified Histogram Feature
Authors: M. Tech. Scholar Kanchan Sahu, Asst.Prof. Pravin Malviya

Abstract:– With the increase in digital media transfer, modification of data has become very easy, so this work focuses on transferring data by hiding it in an image. A carrier image was used to hide the data, utilizing its low-frequency region. The hiding is done with a modified histogram-feature shifting method, applied so that both the hidden information and the image can be effectively recovered. Investigation is done on a genuine image dataset. Evaluation parameter values demonstrate that the proposed work maintains SNR and PSNR values while keeping the information highly secure.
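The histogram-shifting idea behind such reversible hiding can be sketched on a 1-D pixel list (a simplified illustration, not the paper's exact modified method):

```python
from collections import Counter

def embed_bits(pixels, bits):
    """Embed bits at the histogram peak p: values above p shift up by one to
    free the bin p+1, then each occurrence of p carries one bit (p -> p+bit).
    Unused capacity is padded with 0 bits."""
    peak = Counter(pixels).most_common(1)[0][0]
    it = iter(bits)
    stego = []
    for v in pixels:
        if v > peak:
            stego.append(v + 1)
        elif v == peak:
            stego.append(peak + next(it, 0))
        else:
            stego.append(v)
    return stego, peak

def extract_bits(stego, peak):
    """Recover every embedded bit and restore the original pixels exactly."""
    bits, restored = [], []
    for v in stego:
        if v == peak:
            bits.append(0); restored.append(peak)
        elif v == peak + 1:
            bits.append(1); restored.append(peak)
        elif v > peak + 1:
            restored.append(v - 1)
        else:
            restored.append(v)
    return bits, restored
```

Because every shift is invertible, extraction returns both the payload and a bit-exact original, which is what keeps the distortion (and hence PSNR degradation) bounded.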

An Improved Approach for Tracking in Maneuvering Target Environment
Authors: M. Tech. Scholar Umesh Kumar Ahirwar, Asst. Prof. Nitin Choudhary

Abstract:– In modern sensor networks, the most important requirement is self-organizing capacity for tracking maneuvering (non-constant) targets. Flocking-based methods have gained a lot of attention for addressing the control and coordination problem in self-organizing sensor network environments. In this paper we consider two well-known algorithms of this type, namely the Flocking and Semi-Flocking algorithms. Although these two algorithms have demonstrated promising performance in tracking linear targets, they have deficiencies in tracking maneuvering targets. Here the flocking algorithm is applied to tracking targets in a maneuvering environment. We analyze the performance of flocking-based algorithms, both with and without the proposed approach, in tracking both linear and maneuvering targets. Experimental results demonstrate how the flocking algorithm yields better tracking of maneuvering targets, and how applying the flocking concept to the target-tracking process improves the quality of tracking and increases the speed of convergence.

A Review on Different Algorithms for Tracking in Maneuvering Target Environment
Authors: M. Tech. Scholar Umesh Kumar Ahirwar, Asst. Prof. Nitin Choudhary

Abstract:– Self-organizing capacity is the most important need of modern sensor networks, particularly for tracing or tracking maneuvering (non-constant) targets. In this paper we present different maneuvering-tracking techniques and propose which technique is best for such a tracking system. A brief literature review of different existing algorithms for maneuvering tracking is given, and a maneuvering tracking system for modern sensor networks is proposed.

Digital Image Retrieval Using Annotation, CCM and Histogram Features
Authors: M. Tech. Scholar Shalinee Jain, Asst. Prof. Sachin Malviya

Abstract:– The number of web users is increasing every day. This work concentrates on the retrieval of pictures using the visual and annotation characteristics of the images. Two kinds of features are utilized for clustering the image dataset: clusters are formed based on the similarity of the annotation, CCM, and histogram components of the pictures, using a genetic approach for the clustering. The user passes two kinds of queries, the first textual and the other an image, which helps in choosing a suitable cluster for retrieval of the picture. Analysis was done on genuine and artificial sets of pictures. Results demonstrate that the proposed work is better on various evaluation parameters compared with existing strategies.

Development of Safety and Productivity Correlation for A Rolling and Wire Drawing Factory
Authors: M. Tech. Scholar Kamal Shukla, Asst. Prof. Vijay Shankul

Abstract:– The work productivity of workers in a company is affected by several factors, one of which is the occupational safety and health program; such programs have a significant effect on the work productivity of workers, both simultaneously and partially. A poor safety standard primarily originates from the belief that safety and productivity are mutually exclusive objectives, one eating away at the other. But in practice they are correlated. Knowledge of the exact nature of the dependence of productivity on safety and health in industry is therefore highly significant in the context of production. In this project, safety and productivity are the two parameters focused on, and the mutual correlation between the two has been analyzed on the basis of data obtained from the various safety and productivity departments of the industries, presented in tabulated form. The developed correlation is a functional relation between total factor productivity, safety elements, and cost of production. Hence, the developed correlation can be used to predict total factor productivity (TFP) from the values of the input safety elements and the planned cost of production.

Garbage Management for Smart Cities Using Internet of Things
Authors: M. Tech. Scholar Anjali Urmaliya, Dr. Neeraj Shukla

Abstract:– This work addresses garbage management for smart cities using the Internet of Things. In our daily life we produce a lot of waste, and the last few years have witnessed a tremendous rate of population growth, so it is important to manage the garbage generated by people properly. By using the Internet of Things it becomes easy to manage all the garbage: IoT devices such as the Arduino Uno, Raspberry Pi, and ESP8266, together with the MQTT protocol, are used for smart garbage management. The work also supports the “Swachh Bharat Abhiyan” campaign of the Prime Minister of India, whose purpose is to cover all rural and urban areas of the country and present it as an ideal country to the world. Waste management is one of the primary problems facing the developing world. The main aim of the work is to develop a smart, intelligent garbage alert system for proper management of garbage.

Controller Analysis of Reaction Wheel System for Counter Torque
Authors: Ye Min Htay, Thu Thu Aung

Abstract:– This work addresses the reaction wheel system (RWS) for a nano-satellite and a simulator which demonstrates its function. The objective is to design flywheels that counteract the disturbance torques experienced by the satellite in the harsh space environment and to stabilize the simulator platform. The performance of the reaction wheels is analyzed to achieve system stability. The satellite and simulator utilize three reaction wheels as actuators. The controller is designed to change the rotational speed of the reaction wheels to adjust the satellite and simulator to the desired position. A mathematical model of the reaction wheel system is developed using the angular kinetic equation. Control theory is then applied for the required response, dealing with both nonlinearities in the equations and disturbance sources. Simulation of the closed-loop system shows that all desired closed-loop specifications (rise time, settling time, and steady-state error) are robustly satisfied.
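A single-axis toy model shows the control idea: a PD law commands the reaction-wheel torque that drives the attitude error to zero (the gains, inertia, and time step here are hypothetical, not the paper's design values):

```python
def simulate_pd_attitude(theta0, omega0, kp=2.0, kd=3.0,
                         inertia=1.0, dt=0.01, steps=2000):
    """Euler-integrate a rigid body under a PD reaction-wheel torque.
    Returns the final attitude error and angular rate."""
    theta, omega = theta0, omega0
    for _ in range(steps):
        torque = -kp * theta - kd * omega   # reaction torque on the body
        omega += (torque / inertia) * dt
        theta += omega * dt
    return theta, omega
```

With these gains the closed-loop poles sit at -1 and -2, so rise time, settling time, and steady-state error can all be read off the simulated response, mirroring the closed-loop specifications the abstract checks.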

Probing the Effect of Parameters of Wire-EDM on Cutting Speed and Surface Integrity for D2 Steel
Authors: Nitin Kumar, Dinesh Panchal, Naveen Gaur

Abstract:– The demand for alloy steels of high toughness, hardness, and strength is proliferating in the present era of materials. These materials are used for special purposes, and as difficulty is encountered in machining them by traditional methods, wire-cut EDM is used to machine them. It has become an important machining process because it provides an effective way of generating components made of materials like titanium and zirconium and of complex shapes, which are rather difficult to attain by other machining methods. Due to the large number of process parameters and responses, many researchers have tried to optimize this process. A remarkable improvement in process efficiency may be obtained by optimization of parameters, which identifies the region of critical process control factors that leads to the required response while ensuring a lower cost of production.

A Survey on various Techniques of Energy Management in Wireless Sensor Network
Authors: M.Tech.Scholar Prakash Kumar Singh
Abstract:– The availability of sensor devices allows a wide variety of applications to emerge. However, the resource-constrained nature of sensors raises the problem of energy: how to maximize network lifetime despite a very limited energy budget. This paper gives a concise study of the WSN (Wireless Sensor Network) energy-balancing methods proposed by different researchers. Different sorts of protocol requirements for managing WSNs are also discussed, along with their significance and limitations. In this study the techniques are categorized by their working steps, so a comprehensive and comparative understanding of the existing literature is detailed.

A Survey on Online Social Network Based User Community Identification Techniques and Features
Authors: Asst.Prof.Raju Sharma
Abstract:– The number of web users is growing each day, and linking a user's social identity across the different social media platforms is now of critical significance to business intelligence. In this paper, similarity-based user community detection methodologies are surveyed for finding distinctive groups of users. Different features of social user profiles are identified as per the requirement, and current issues are summarized in the paper toward a solution for the work.

GSM Based Gas Leakage Detection System
Authors: Mohit Nankani, Vinay Nagrani, Ajay Mandliya, Jemila Rani
Abstract:– Gas leakage is a major problem in the industrial sector, residential premises, and gas-powered vehicles such as CNG (compressed natural gas) buses and cars. One preventive method against accidents associated with gas leakage is to install a gas leakage detection kit at vulnerable places. The aim of this paper is to present a design that can automatically detect and stop gas leakage in vulnerable premises. In particular, a gas sensor with high sensitivity to propane (C3H8) and butane (C4H10) has been used. The gas leakage system includes a GSM (Global System for Mobile Communications) module, which warns by sending an SMS. Earlier gas leakage systems could not react in time, so this paper provides the design approach for both software and hardware.

A Study of the Learning Curve of the Japanese Keyboard on Smartphone
Authors: Muhammad Suhaib
Abstract:– Japanese (Nihongo) is an East Asian language that makes extensive use of Chinese characters called KANJI and phonetic characters called KANA (hiragana and katakana). Many people assume the Japanese keyboard on a smartphone is difficult to use, so I decided to conduct a study to determine how easy the Japanese keyboard is for new users to learn. Two users with no experience with the Japanese keyboard on a smartphone were asked to enter some Japanese sentences as fast as they could at least 20 times, and the same was asked of experienced users as well.

Dominance of the Artemisia and Astragalus Communities in Arid Areas of Iran: True or Not?
Authors: Reza E. Owfi
Abstract:– As mentioned in most of the relevant sources, in the arid regions of Iran the dominance of the Artemisia and Astragalus communities may vary from one Astragalus species to another, whether individually or in combination. Two previous researches on vegetation coverage are considered here, from two regions of Iran that both belong to the group of arid areas yet lie a large distance from each other: the catchment area of the Haj Hassan farm in Yazd province and Maharloo Lake in Fars province. After examining the two researches, their results were evaluated and the information obtained from the studies was analyzed, which indicates the validity of the aforementioned claim.

Social Security Allowances in Nepal and Their Impact on the Rural Economy: Evidence From Chandrapur Municipality, Rautahat
Authors: PhD. Scholar Kirti Raj Subedi
Abstract:– The social security allowances program (SSAP) is becoming popular day by day in Nepal. Nepal's national social protection program (SPP) aims to control multidimensional deprivation and life-cycle risks and to provide people with a basic minimum livelihood. Nepal has launched many social protection programs; its social security allowances program includes five schemes targeting: Dalit children under five years of age, widows and single women over 60 years of age, people with disabilities, senior citizens over 70 years of age (over 60 in the case of Dalits), and highly marginalized indigenous ethnic groups (Janajatis). These SSAP schemes are managed by the Department of Civil Registration (DoCR) under the Ministry of Federal Affairs and General Administration (MoFAGA) and delivered through local governments. The UN's Sustainable Development Goals (SDGs) propose providing social security for all by 2030 (World Bank, 2015). Developing countries are introducing social security allowance programs on a large scale with different modalities. Nepal spends 2.3% of its Gross Domestic Product (GDP) on social security, and the government provides more than 89 social security allowance schemes to citizens through various ministries.
Social protection is inevitable for a better lifestyle for the poor, and political support for it is also highly acceptable in Nepal, but the economic condition and paying capacity of the government are very low, so how much the government should pay is a matter of debate. Nepal has just successfully completed elections for its three tiers of government. The social security allowances are managed only by the federal government, while some social protection schemes are launched by local governments. Nepal has several social protection schemes, but they are not adequate in terms of cash transfer. Lacking an integrated social protection policy, the government's cost of managing the social protection program is growing.

Improvement in Cloud Storage Auditing with Verifiable Outsourcing of Secure Key
Authors: M. Tech. Scholar Tanuj Sharma, Asst. Prof. Lakhan Singh, Asst. Prof. Ankur Taneja
Abstract:– Cloud computing is a recent technological advancement in the computing field, mostly focused on designing services which can be delivered to clients in the same way as fundamental utilities like food, water, gas, electricity, and communication. Cloud storage services have become increasingly popular, and because of the importance of privacy, many cloud storage encryption schemes have been proposed to protect data from those who do not have access. The existing system has the following associated problems, which our proposed work addresses: AES-256 is quite common, and tools for attacking it are easily available to hackers who desire to break it, and a highly indexed data structure is not used in the existing system. In this paper we use the standard SHA-2 algorithm for message key generation and an optimized Blowfish algorithm for data encryption; after completing these processes we also locate the proxy server in the cloud system. For simulation we used CloudSim, a Java-based simulator.
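The abstract mentions SHA-2 for message key generation. One common pattern (an assumption for illustration; the paper's exact scheme may differ) is deriving a per-file key from a master secret with HMAC-SHA-256 from Python's standard library:

```python
import hashlib
import hmac

def derive_file_key(master_secret: bytes, file_id: str) -> bytes:
    """Deterministic 32-byte per-file key: HMAC-SHA-256(master, file_id).
    The same inputs always yield the same key, so an auditor can re-derive
    it on demand, while different files get unrelated keys."""
    return hmac.new(master_secret, file_id.encode("utf-8"), hashlib.sha256).digest()

key = derive_file_key(b"master-secret", "report-2018.pdf")
```

The derived key could then feed the symmetric cipher (such as the optimized Blowfish variant the paper proposes) for the actual data encryption.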

Analysis of Tuberculosis Surveillance Tools at the Ministry of Health:The Case for Computerise/Mobile Surveillance Systems
Authors: Bernard Rotich
Abstract:– Tuberculosis (TB) is a major global health concern, causing nearly ten million new cases and over one million deaths every year. In Kenya, it remains a major cause of morbidity and mortality, affecting a substantial percentage of the population. Early detection of TB through surveillance is an effective intervention measure against tuberculosis. The general objective of the study was to investigate how TB surveillance is done in Kenya. The specific objectives were to: establish the current challenges facing the TB surveillance system, review the current state of TB surveillance systems, identify the requirements for a TB surveillance system, and design and develop a mobile integrated prototype system for TB surveillance. Qualitative data was analysed to provide a deeper understanding of the user requirements of the system, which were then used to design and develop a TB surveillance system using the Prometheus agent design methodology. The system was implemented in PHP, with MySQL for the database and the Java Agent Development Framework for the multi-agent platform. The designed TB surveillance system enabled the medical practitioner to interact with the patient. The tests included the use of mobile phones to capture data from remote centres. Usability and functionality tests were performed, indicating that the application was an effective and responsive surveillance tool. Finally, the study concluded by pointing out the key areas of future improvement for the existing system and recommendations for future research.

Digital Energy Meter and Fuel Theft Detection Using PIC Microcontroller
Authors: A Jemila Rani
<p style="text-align: justify;">Abstract:– Today's world has many techniques for the measurement of any quantity, including fuel. Fuel meters are analog, so we are trying to digitize them to show the fuel value digitally. In our project we display the amount of fuel present in the fuel tank digitally. Fuel theft is also a problem all over the world: if fuel is stolen, a text message is sent to the owner of the bike, and a buzzer sounds so that the owner becomes aware. With this system, fuel theft from a conventional vehicle such as a bike can be avoided.
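The theft-detection logic described above can be sketched as a threshold check (the threshold, function name, and readings below are hypothetical, not values from the paper); on real hardware the PIC firmware would trigger the GSM text message and buzzer whenever this check fires:

```python
IGNITION_OFF_DROP = 0.5  # liters; hypothetical theft threshold

def check_theft(prev_level: float, level: float, ignition_on: bool) -> bool:
    """Flag theft when the tank level falls noticeably while the engine is
    off: normal consumption only happens with the ignition on."""
    return (not ignition_on) and (prev_level - level) > IGNITION_OFF_DROP

alerts = [check_theft(12.0, 11.9, True),    # riding: normal consumption
          check_theft(11.9, 11.8, False),   # parked: tiny change, ignore
          check_theft(11.8, 9.0, False)]    # parked: sudden 2.8 L drop
print(alerts)  # [False, False, True]
```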

Power Generating Low Cost Green Electric Vehicle
Authors: Riddik Adhikari, Subrata Mondal
<p style="text-align: justify;">Abstract:– This is a low-cost vehicle, much more efficient than present electric cars. We took the concept of Tesla cars and the hybrid car concept for our low-cost electric vehicle, using the motor-generator concept. The vehicle will run on green energy and will emit no harmful gases; moreover, it is an eco-friendly concept. It has no complicated machinery, so the maintenance cost is low compared to other ICE (internal combustion engine) vehicles. We replace the ICE with the motor-generator concept, which helps in the propulsion of the car. Hybrid cars are popular these days, but their main drawback is that they are not fully eco-friendly, as they have an ICE along with an electric motor, battery, and several other electrical components. This concept will be more efficient than other electric, ICE, and hybrid vehicles. With greater mileage, it will be more efficient than present electric cars, which makes this concept attractive. Last but not least, an important feature of the car is that its efficiency will not differ much between loaded and unloaded conditions.

Image Optimization and Segmentation by Selective Fusion in K-Means Clustering
Authors: D. Malathi M.E., Asst. Prof. A. Mathan Gopi <p style="text-align: justify;">Abstract:– We present a simple, reduced-complexity and efficient image segmentation and fusion approach. It optimizes the segmentation process of colored images by fusing K-means clusters in various color spaces, in order to finally obtain a more reliable, accurate and non-overlapped image. The initial segmentation maps are produced by taking a local histogram of each pixel and allocating it to a bin in the re-quantized color space. The pixels in the re-quantized color spaces are clustered into classes using the K-means (Euclidean distance) technique. K-means clustering tends to find clusters of comparable spatial extent, while the expectation-maximization mechanism allows clusters to have different shapes, and a selective fusion procedure is followed to reduce the computational complexity and achieve a better-segmented image. The parameters considered for selection of initial segmentation maps include entropy, standard deviation, and spatial frequency. The performance of the proposed method is analyzed by applying it to various images from the Berkeley image database. The result aims at developing an accurate and more reliable image compared with other methods, along with reduced complexity, processing time and hardware resources required for real-time implementation.
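The clustering step at the heart of this approach can be sketched with a toy K-means over RGB pixel values — a simplified stand-in for the re-quantized color-space histograms the paper uses, with made-up pixel data:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain K-means with Euclidean distance, clustering pixel colors
    into segmentation classes."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:  # assign each pixel to its nearest center
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        for i, cl in enumerate(clusters):  # move centers to cluster means
            if cl:
                centers[i] = tuple(sum(d) / len(cl) for d in zip(*cl))
    return centers

# Two obvious color groups: near-black and near-white RGB pixels.
pixels = [(10, 12, 9), (8, 10, 11), (250, 248, 251), (247, 249, 250)]
centers = sorted(kmeans(pixels, 2))
print(centers)  # [(9.0, 11.0, 10.0), (248.5, 248.5, 250.5)]
```

Each recovered center is a representative color; labeling every pixel by its nearest center is what produces one initial segmentation map per color space.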

Image Optimization and Segmentation by Selective Fusion in K-Means Clustering
Authors: M.Tech.Scholar Sonam Shrivastava, Asst. Prof. Priya Jha <p style="text-align: justify;">Abstract:– The digital world is growing with various kinds of data such as text files, images, and video. Among these, images play an important role in different fields such as remote sensing, social media, etc. Degradation in image quality may be attributed to absorption and back-scattering of light by suspended underwater particles. Moreover, as the depth increases, different colors are absorbed by the surrounding medium depending on their wavelengths. Maintaining image quality is therefore addressed by digital image processing across various issues. This paper gives a brief survey of haze and underwater image enhancement techniques for various conditions, as environmental conditions vary from time to time with the presence of fog, dust, and water. Image analysis features are described in this paper along with their requirements.

Synthesis, Characterization and Application of Resins Obtained from Agricultural Waste to Remove Cu (II) from Waste Water
Authors: Mohammad Azam, Dr. Gulrez Nizami, Nida Tanveer, Mohammad Arshad, Sheela <p style="text-align: justify;">Abstract:– The objective of this work is the removal of the heavy metal ion Cu(II) from the wastewater of different industries such as mining, smelting, plating, brass manufacture, petroleum refining, electroplating, and Cu-based agrichemicals. In this work, some waste agricultural materials are used as the adsorbent for the elimination of copper ions from the wastewater of copper plants. Agricultural materials like rice straw, rice bran, rice husk, hyacinth roots, coconut shell and neem leaves were crushed in a roll crusher, ground to a powder, dried in an oven, and then used as a copper adsorbent in copper wastewater solutions of different pH. The resulting solutions were measured by FTIR and other techniques.

A Review of Factors of Cost Overrun in Developing Countries
Authors: Ijaz Ahmad Khan <p style="text-align: justify;">Abstract:– The success of any development project depends on its timely completion within the targeted budget, with the best attainable quality and setting, and is of great concern. Cost overruns are perceived as a leading constraint in various construction enterprises of developing nations. Cost overrun in construction projects is common worldwide, and construction in developing nations is no exception; it is without question one of the main problems in building projects. The causes of construction cost overrun may vary from country to country owing to differences in political, financial, social, and environmental conditions. The situation is more serious in developing nations, where these overruns at times exceed 100% of a project's planned cost. To avoid cost overruns, the first and most essential step is to recognize and establish the causes and factors responsible for them. Thus, this paper sets out the various factors responsible for construction cost overrun in developing countries, which will serve as a way forward for future work in coping with these overruns. This work is an overview of the causes of cost overrun across several developing countries; it has been observed that the causes are not alike in every project, although some of them are common, such as poor management, fluctuation of material prices, incorrect material estimates, and financial instability.

Total Factor Productivity and Gross Value Addition In Services Sector of India From 2000 To 2010
Authors: Dr. Manish Tongo <p style="text-align: justify;">Abstract:– The service sector of India has lately played the role of igniter so far as the growth rate of GDP is concerned. Post globalization, the service sector alone emerged as the largest contributor to the GDP of India. The economy of India, earlier agrarian by default, took a drastic turn, and the service sector today occupies the driver's seat in steering the growth of the Indian economy. This divergence from a product economy to a service economy has not only accelerated the growth of India but has certainly elevated the brand image of India across the world. It won't be out of place to mention here that before the globalization of the Indian economy, the image of India in the eyes of developed nations was far more negative than one could conceive of.

A Survey on Frequent Pattern Rules Techniques for Privacy Preserving mining
Authors: M.Tech. Scholar Shivani Pandey, Asst. Prof. Monali Sahoo <p style="text-align: justify;">Abstract:– Information sharing among organizations is a common activity in several areas like business development and marketing. A portion of the sensitive rules that should be kept private may be revealed, and such disclosure of sensitive patterns may impact the interests of the organization that owns the information. Consequently, the rules which are sensitive must be hidden before sharing the information. In this paper, to provide secure data sharing, sensitive rules are perturbed first, and the various techniques found for doing so are discussed. Techniques of privacy-preserving data mining are also detailed.

Automatic Irrigation System on Sensing Soil Moisture Content Using PV and GSM
Authors: Rohith Chilumula, Rahul Peddibhotla <p style="text-align: justify;">Abstract:- Agriculture and gardening works are not trivial. There is a wide range of crops and plants and many varieties of each plant or crop, and different plants and crops have different requirements for water, fertilizers and sun. Soil fertility for any strain or planting practice is mostly judged by the level of nutrients and moisture in it. Often, farmers and gardeners are not able to feed the soil with enough fertilizer or water at the right time, especially when this is done manually. This project is intended to help farmers and gardeners maintain control of the soil moisture level.
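The moisture-based switching at the core of such a system can be illustrated with a small hysteresis sketch. The thresholds and readings are hypothetical (a real deployment would calibrate them to the sensor and crop), and the PV/GSM hardware from the paper is simulated here:

```python
# Hypothetical calibration values, not from the paper.
DRY_THRESHOLD = 30   # percent moisture below which the pump turns on
WET_THRESHOLD = 60   # percent moisture above which the pump turns off

def pump_command(moisture_pct: float, pump_on: bool) -> bool:
    """Hysteresis control: start irrigating when the soil is dry, stop once
    it is wet enough, and hold the current state in between."""
    if moisture_pct < DRY_THRESHOLD:
        return True
    if moisture_pct > WET_THRESHOLD:
        return False
    return pump_on

state = False
log = []
for moisture in [25, 40, 65, 50]:  # simulated sensor readings
    state = pump_command(moisture, state)
    log.append(state)
print(log)  # [True, True, False, False]
```

The dead band between the two thresholds prevents the pump from chattering on and off around a single set point.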

Speed Synchronization of DC Motors by Using Microcontroller
Authors: Karthik Reddy Solipuram, Nennuru Gopala Krishna Reddy <p style="text-align: justify;">Abstract:- In this project, a new control approach for real-time speed synchronization of multiple induction motors during acceleration and load changes is developed. The control strategy stabilizes the speed tracking of each motor while synchronizing its motion with the other motors' motions, so that differential speed errors among the motors converge to zero. In industry, many processes require speed synchronization of more than one motor simultaneously. Speed control of motors is important especially in fields involving mechanical applications, robotics, textile mills, and so on. In all these applications, motor speed synchronization is essential, for example in conveyor belts driven by multiple motors. Sudden changes in load cause hunting and oscillatory behaviour in a DC machine, and this behaviour can be harmful to the process. There are many strategies for controlling DC machines; among them, master-slave synchronization is a widely used technique. The ADC available on the microcontroller chip closes the feedback loop, and a driver circuit is used to drive the motor. In this method, regulation of the motor's speed is achieved by changing the motor voltage, which is adjusted via the duty cycle of the PWM signal.
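The master-slave PWM idea can be illustrated with a toy simulation: the slave's duty cycle is nudged proportionally to the speed error until it matches the master. The gain, motor model, and speeds below are illustrative assumptions, not values from the paper:

```python
def adjust_duty(duty: float, master_rpm: float, slave_rpm: float,
                kp: float = 0.0005) -> float:
    """Proportional master-slave correction: nudge the slave's PWM duty
    cycle in the direction that closes the speed error, clamped to [0, 1]."""
    error = master_rpm - slave_rpm
    return min(1.0, max(0.0, duty + kp * error))

def motor_rpm(duty: float) -> float:
    """Toy motor model: steady-state speed proportional to duty cycle."""
    return 3000.0 * duty

duty, master = 0.5, 2400.0  # slave starts at 1500 rpm, master runs at 2400
for _ in range(200):        # one correction per control-loop tick
    duty = adjust_duty(duty, master, motor_rpm(duty))
print(round(motor_rpm(duty)))  # converges to 2400
```

In the real system the `motor_rpm` reading would come from the microcontroller's ADC feedback loop rather than a formula.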

Facial Detection Based on K-means and Local Binary Patterns
Authors: M.Tech. M. Muralidhar Reddy, Asst. Prof. G. Varaprasad, Asst. Prof. Karamala Suresh <p style="text-align: justify;">Abstract:– Face detection and recognition are challenging research topics in the field of computer vision. Several algorithms have been proposed to solve the many problems related to changes in environment and lighting conditions. In this research, we introduce a new algorithm for face identification and detection. The proposed method uses the well-known local binary patterns (LBP) algorithm and K-means clustering for face segmentation, and maximum likelihood to classify the output data. The method can be summarized as a process of detecting and recognizing faces on the basis of the distribution of feature vector amplitudes on six levels, that is, three for positive vector amplitudes and three for negative amplitudes. Detection is conducted by classifying distribution values and deciding whether or not these values compose a face.
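The LBP feature underlying this method is easy to sketch: each pixel receives an 8-bit code by comparing its 3x3 neighbours against the centre value. The neighbour ordering below is one common convention; the paper may use a different one:

```python
def lbp_code(window):
    """Local binary pattern of the centre pixel in a 3x3 window: each of
    the 8 neighbours contributes a bit (1 if >= centre), read clockwise
    from the top-left corner."""
    c = window[1][1]
    order = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    bits = ["1" if window[r][k] >= c else "0" for r, k in order]
    return int("".join(bits), 2)

patch = [[90, 110, 85],
         [70, 100, 120],
         [95, 130, 60]]
print(lbp_code(patch))  # 84, i.e. bit pattern 01010100
```

Histograms of these codes over image regions form the texture feature vectors that are then clustered and classified.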

Security and Privacy Issues in EHR Systems Towards Trusted Services
Authors: M.E. G. Kavita, Asst. Prof. K. Bala <p style="text-align: justify;">Abstract:- Recent years have seen widespread availability of electronic healthcare record (EHR) systems. EHRs are produced during the course of treatment in medical centres such as hospitals, clinics, and other institutions. In order to improve the quality of healthcare service, EHRs could be shared by a variety of users, so strong security must be provided to make EHRs practical. Existing EHR systems have not completely resolved the privacy challenges. In this paper a systematic literature review was conducted on the privacy issues in EHRs to identify the security models in use. In addition, a novel Context-aware Access Control Security Model (CARE) is proposed to capture the scenario of data interoperability and support the security essentials of healthcare systems, along with the ability to provide fine-grained access control.

Reverse Logistic Management in Construction
Authors: Mr. Anish Kanti Bera <p style="text-align: justify;">Abstract:- Reverse logistics in construction refers to the movement of products and materials from salvaged buildings to a new construction site. Given the various facets of the reverse logistics chain, there are a large number of studies, but there is no systematic review of the literature on this important topic as it applies to the construction industry. Therefore, the purpose of this study is to integrate the fragmented body of knowledge on reverse logistics in construction, with the aim of promoting the concept among the stakeholders of the industry and the wider construction community. Through qualitative meta-analysis, the study synthesizes the findings of previous studies and presents some of the tasks required of industry stakeholders to promote this concept in contexts of real interest. First, the research and terminology trends for reverse logistics are introduced. Second, the study identifies the main advantages of and constraints on reverse logistics in construction, while providing some tips for realizing these benefits and reducing these constraints. Finally, it provides future research directions based on the review. Unlike in the manufacturing context, the RL literature in the construction sector is inadequate, which could be one reason the knowledge of RL and its applications in the construction sector is limited. To address this issue, this study attempts to identify and highlight the fundamental aspects of the RL concept, which dramatically affect its adoption and implementation, through an integrated review of the literature.

Relationship between Stock Prices and Rupee Dollar Parity in India
Authors: Prof. Anchit Jhamb, Ms. Swati Aggarwal <p style="text-align: justify;">Abstract:- The dynamic linkage between exchange rates and stock prices has been subjected to extensive analysis for over a decade and attracted considerable attention from researchers worldwide during the Asian crisis of 1997-98. The issue is also important from the perspective of the recent large cross-border movement of funds, and in India it is gaining importance in the liberalization era. With this background, the current study examines the causal relationship between returns in the stock market and the foreign exchange market in India. Using daily data from March 1993 to December 2002, we find that a causal link is mostly absent, although in recent years there has been a strong causal influence from stock market returns to forex market returns. The results, however, are tentative, and there is a need for in-depth analysis to identify the causes and consequences of the findings.

Intelligent Spam Detection Microservice with Serverless Computing
Authors: Prachi Mahajan, Snehal Bhoite, Asst. Prof. Abhijit Karve <p style="text-align: justify;">Abstract:- Today, for personal and business purposes, most users rely on email as one of the most important channels of communication. The use of email is increasing day by day, unaffected by alternative ways of communication such as social networking sites, SMS, mobile applications, and electronic messages. As fraud via email is increasing with the extensive use of email, classifying mails as fraudulent or normal becomes a very important issue. The Intelligent Spam Detection System (ISDS) provides an automatic way to classify emails as SPAM, i.e. fraudulent mail, or HAM, i.e. normal mail, using multiple machine learning algorithms.
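As one illustrative stand-in for the "multiple machine learning algorithms" mentioned (the abstract does not name them), a tiny naive Bayes classifier over word counts shows the SPAM/HAM decision. The corpus here is made up for demonstration; the real ISDS would train on large labelled email sets:

```python
from collections import Counter
import math

# Tiny illustrative training corpus (not from the paper).
train = [("win cash prize now", "SPAM"), ("meeting agenda attached", "HAM"),
         ("claim your free prize", "SPAM"), ("lunch with the team", "HAM")]

counts = {"SPAM": Counter(), "HAM": Counter()}
for text, label in train:
    counts[label].update(text.split())

def classify(text: str) -> str:
    """Naive Bayes with Laplace smoothing over word counts."""
    vocab = set(counts["SPAM"]) | set(counts["HAM"])
    best, best_lp = None, -math.inf
    for label in ("SPAM", "HAM"):
        total = sum(counts[label].values())
        lp = math.log(0.5)  # equal class priors in this toy corpus
        for w in text.split():
            lp += math.log((counts[label][w] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

print(classify("free cash prize"))  # SPAM
```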

An Analysis of Land Use & Land Cover Mapping Using Geo-Spatial Technology: A Case Study of Pat Watershed in Jhabua District, Madhya Pradesh, India
Authors: Chetan Singh Hada, Bahul Kumar Vyas, Jyoti Sarup, Dr. D.C. Gupta <p style="text-align: justify;">Abstract:- Remote Sensing (RS), as a direct aid to fieldwork, has of late been playing an essential role in the examination and assessment of natural resources in any part of the world. Land use and land cover change rapidly; although they are often thought to be identical, they are in fact quite different. Land cover may be characterized as the biophysical earth surface, while land use is frequently shaped by human, financial and political influences on the land. Remote Sensing, integrated with Geographical Information Systems (GIS), provides a useful instrument for the analysis of land use and land cover changes at a regional level. The geospatial technology of RS and GIS holds the potential for timely and cost-effective assessment of natural resources. These techniques have been used extensively in the tropics for generating valuable information on forest cover, vegetation type and land use changes. Accordingly, we have used RS and GIS to study land use and land cover changes in the Pat watershed, Jhabua district, Madhya Pradesh, India, covering an area of around 817.93 sq. km. In this view, the present work was taken up to study and evaluate some of the natural resources and the environmental potential of the study area, which falls in Survey of India toposheets No. 46I/8, 46I/12, 46J/5 and 46J/9. Under this investigation three thematic maps, namely the location map, base map, and land use and land cover map, were prepared. The land use and land cover analysis of the study area was attempted based on thematic mapping of the zone, comprising built-up land, agricultural land, forest, wasteland and water bodies, using the satellite image. The study concludes that there is rapid growth of the built-up area.
Land use and land cover information, when used along with information on other natural resources, such as water, soil, and hydro-geomorphology, will help in optimal land use planning at the macro and micro level.

A Review Article of Micro Grid Power Boosting Technique
Authors: M.Tech. Scholar Chandra Kant Sharma, Prof. Pragati Priya <p style="text-align: justify;">Abstract:- Energy, especially alternative sources of energy, is vital for the development of a country. In future, the world anticipates developing more of its solar resource potential as an alternative energy source to overcome persistent shortages and unreliability of power supply. In order to maximize the power output, the system components of the photovoltaic system should be optimized. For this optimization, maximum power point tracking (MPPT) is a promising technique that grid-tie inverters, solar battery chargers and similar devices use to extract the maximum possible power from one or more solar panels. Among the different methods used to track the maximum power point, the perturb and observe method is a strategy to optimize the power output of an array. In this method, the controller adjusts the array voltage by a small amount and measures the power; if the power increases, further adjustments in that direction are tried until the power no longer increases. In this research paper the system performance is optimized by the perturb and observe method using a buck-boost converter. By varying the duty cycle of the buck-boost converter, the source impedance can be matched to the load impedance to improve the efficiency of the system. The performance has been studied in MATLAB/Simulink.
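The perturb and observe loop described above can be sketched in a few lines. The P-V curve and step size here are toy values, not a real module model; the key behaviour is that the operating voltage climbs toward the maximum power point and then oscillates around it:

```python
def panel_power(v: float) -> float:
    """Toy P-V curve with a single maximum at 17 V (illustrative numbers)."""
    return max(0.0, 100.0 - (v - 17.0) ** 2)

def perturb_and_observe(v: float, step: float, last_p: float, direction: int):
    """One P&O iteration: keep perturbing in the same direction while power
    rises; reverse the direction when power falls."""
    p = panel_power(v)
    if p < last_p:
        direction = -direction
    return v + direction * step, p, direction

v, p, d = 12.0, panel_power(12.0), +1
for _ in range(100):
    v, p, d = perturb_and_observe(v, 0.1, p, d)
print(round(v, 1))  # settles within one step of the 17 V maximum
```

In the hardware described, the same perturbation is applied to the buck-boost converter's duty cycle, which shifts the operating voltage seen by the panel.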

Water Quality Mapping using Remote Sensing and GIS: A Case Study of Pampawa Watershed in Jhabua District, Madhya Pradesh, India
Authors: Chetan Singh Hada, Bahul Kumar Vyas, Jyoti Sarup, Dr. D.C. Gupta <p style="text-align: justify;">Abstract:- Groundwater serves as the main source of water in the urban environment, used for drinking, industrial and domestic purposes, and it is often over-exploited. Nowadays, groundwater is facing threats due to anthropogenic activities. In this study, groundwater samples were collected in the pre- and post-monsoon seasons for a total of 56 villages from 61 predetermined bore wells and open wells representing the Pampawa watershed in Jhabua District, Madhya Pradesh, India. The water samples were analyzed for physico-chemical parameters like TDS, chloride, fluoride, pH, hardness and turbidity using standard laboratory techniques. In addition, geographic information system-based groundwater quality mapping, in the form of visually communicative contour maps, was developed using ArcGIS version 10.1 software to delineate the spatial distribution of physico-chemical characteristics of the groundwater samples.

Removal of Calcium and Magnesium Ions from Ground Water in the Jaffna Peninsula, Sri Lanka by using Chemically Modified Rice Husk
Authors: Anushkkaran. P, Asharp Sharmec. G, Mazenod Denorth. V.A <p style="text-align: justify;">Abstract:- Water hardness due to calcium and magnesium ions in groundwater has had a devastating effect on freshwater in the dry zone of Sri Lanka, mainly in the Jaffna peninsula. The Jaffna peninsula depends on groundwater, as there are no other freshwater sources and the rainfall is not sufficient. The removal of calcium and magnesium ions using rice husk (RH) as a low-cost adsorbent was investigated in groundwater of the Jaffna peninsula, Sri Lanka. This study was conducted to evaluate the groundwater quality and the removal of hardness. Fifteen wells were selected in different regions of the Jaffna peninsula for groundwater sampling. The impact of operational conditions, such as the dosage of rice husk and the contact time, was analyzed for chemically modified rice husk. HCl and NaOH were used to chemically modify the rice husk at pH 4 and pH 8, respectively. Statistical analysis revealed that the highest removal of Ca2+ and Mg2+ ions occurs at pH 4, an adsorbent dosage of 10 g/L of water, and 30 minutes of settling time.
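Removal efficiencies in such adsorption studies are conventionally computed from the initial and equilibrium ion concentrations. A quick sketch of the standard formula, with made-up concentrations rather than the paper's data:

```python
def removal_percent(c0: float, ce: float) -> float:
    """Standard adsorption removal efficiency: 100 * (C0 - Ce) / C0,
    where C0 and Ce are the initial and equilibrium concentrations (mg/L)."""
    return 100.0 * (c0 - ce) / c0

# Hypothetical hardness reduction for illustration only.
print(removal_percent(120.0, 18.0))  # 85.0 (% of ions removed)
```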

Solar PV – Battery Storage with DVR for Power Quality Enhancement

Authors: K. Venkatrami Reddy, P. Chinnaiah, V. Sunil Kumar Reddy
<p style="text-align: justify;">Abstract:- The consumption of electric power is very high owing to rapid innovation and a growing number of non-linear loads. Most of the loads are non-linear and cause harmonic currents in the system. These harmonic currents in turn create system resonance, capacitor overloading, decreased efficiency, and voltage magnitude changes. Power quality has become an increasing concern to utilities and consumers, and the power transmitted on a distribution line needs to be of very high quality. One of the major power quality issues in the distribution system is voltage sag, which can be mitigated with the help of a dynamic voltage restorer (DVR). In this paper, a new integration of a solar PV-battery based dynamic voltage restorer is implemented in the distribution system to meet the required power and to enhance power quality. Solar photovoltaics with a boost converter are implemented using the incremental conductance method to track the maximum power point. The performance of the solar photovoltaic and battery with the dynamic voltage restorer is simulated under dynamic load conditions in MATLAB/Simulink software.

The API Architect’s Playbook: Designing Scalable Integration Solutions For Salesforce

Authors: Kirandeep Kaur

Abstract: In the rapidly evolving landscape of enterprise applications, Salesforce has emerged as a dominant force in customer relationship management, empowering organizations to streamline business processes and strengthen customer engagement. However, to fully realize the potential of Salesforce, enterprises need scalable and reliable integration strategies that ensure seamless connectivity with enterprise resource planning systems, third-party applications, and cloud-native platforms. This requires the expertise of an API architect who can design robust integration frameworks to enhance performance, ensure data consistency, and maintain system resilience. The concept of scalability is central to modern integration, ensuring that while organizations respond to growing workloads and increasingly diverse application ecosystems, the underlying architecture remains efficient and sustainable. An API architect’s playbook structures these practices into actionable principles that guide decision-making, promote system agility, and guarantee optimized business outcomes. The key dimensions of such a framework include API design principles, security and compliance enforcement, governance models, lifecycle management, and performance optimization. Scalability must be embedded at every layer, from defining granular microservices to leveraging event-driven communication strategies and middleware orchestrations. Moreover, Salesforce-centered integrations must address unique features such as metadata extensibility, multi-tenant isolation, and API limits, demanding both technical ingenuity and strategic foresight from architects. By leveraging reusable API patterns, standardization approaches, and secure service interfaces, organizations can reduce operational overhead and prepare for long-term interoperability.
The playbook outlined here provides a comprehensive exploration of these concepts, illustrating the critical role of the API architect as both a strategist and technical designer. Through a structured approach encompassing design fundamentals, security, lifecycle management, governance, and real-world deployment strategies, organizations can advance their Salesforce integration journey with resilience and foresight. The convergence of scalability with adaptability and future-ready architecture ensures that Salesforce integrations not only meet immediate organizational needs but also support continuous growth across dynamic markets.

DOI: http://doi.org/10.5281/zenodo.17277508

The Connected Enterprise: Using REST And SOAP APIs To Unify Your Business Systems

Authors: Amandeep Kaur

Abstract: In the digital economy, enterprises succeed by leveraging seamless connectivity across their diverse business systems. As organizations adopt cloud applications, third-party tools, and modern enterprise platforms, the challenge of interoperability becomes a central concern. REST and SOAP APIs have emerged as fundamental technologies enabling structured, scalable, and reliable integration across disparate systems, empowering businesses to create a truly connected enterprise. REST APIs focus on lightweight, stateless, and web-centric design, which makes them ideal for modern applications needing rapid deployment and mobile accessibility. SOAP APIs, on the other hand, bring rigid standards and enhanced security features, making them indispensable for mission-critical systems in finance, healthcare, and government. By unifying these approaches, enterprises can maximize existing investments in legacy systems while embracing agility and innovation in newer platforms. The concept of a connected enterprise transforms organizations into ecosystems where data flows continuously between customer-facing interfaces, back-end systems, databases, and third-party applications. Such connectivity eliminates silos, enhances operational efficiency, and provides leaders with real-time insights for decision-making. Furthermore, APIs enable digital transformation by supporting cloud migrations, process automation, and integration with emerging technologies such as machine learning and IoT. The orchestration of REST and SOAP APIs helps organizations streamline workflows, create consistency in customer experiences, and reduce bottlenecks in information exchange. This paper explores the role of REST and SOAP APIs in creating a connected enterprise. It highlights the advantages of both frameworks, explains strategies for integration, and discusses how hybrid API environments unlock new business value. 
Real-world use cases demonstrate how enterprises in sectors such as finance, logistics, retail, and healthcare are using APIs to merge traditional architectures with modern digital ecosystems. The discussion further explores governance, scalability, and security considerations that enable enterprises to future-proof their integration strategies. Ultimately, building a connected enterprise using REST and SOAP APIs is not just a technological upgrade but a competitive imperative for organizations that aim to foster agility, resilience, and innovation in a highly dynamic business environment.

DOI: http://doi.org/10.5281/zenodo.17277559

Salesforce And The External World: A Deep Dive Into API-Driven Data Synchronization

Authors: Jagtar Singh

Abstract: In today’s digital enterprise landscape, data interoperability has emerged as the backbone of organizational growth and operational intelligence. Businesses increasingly rely on diverse applications to manage customer engagement, enterprise resource planning, financial operations, marketing automation, and human capital management. Salesforce, as the world’s leading customer relationship management (CRM) platform, sits at the center of these processes, where the ability to connect seamlessly with external systems directly impacts customer satisfaction and organizational agility. The mechanism enabling this connectivity is application programming interfaces (APIs), which have evolved into powerful enablers of cross-application communication for real-time data synchronization. API-driven synchronization ensures that Salesforce data remains consistent with external systems such as ERP software, data warehouses, payment gateways, cloud-native applications, healthcare systems, and IoT platforms. This enhances data visibility and eliminates silos, enabling end-to-end automation and improved analytics. Moreover, APIs allow enterprises to adopt a modular approach where technology environments evolve without legacy bottlenecks. As organizations transition toward hybrid and cloud-centric infrastructures, the reliance on standardized, secure, and scalable data flow through APIs becomes even more critical. This evolution is deeply connected to business continuity, regulatory compliance, and customer experience, making Salesforce integrations a strategic necessity rather than a technical utility. This article explores Salesforce API-driven data synchronization in depth, focusing on how it bridges Salesforce with the external world. It examines integration types, synchronization frameworks, security, governance, real-world use cases, challenges, and strategic recommendations for enterprises. 
By analyzing these dimensions, the article provides insights into harnessing Salesforce APIs not just for operational integration, but for delivering intelligent, adaptive, and future-ready digital ecosystems. Keywords framing this discussion include Salesforce integration, API synchronization, enterprise interoperability, data orchestration, and real-time connectivity.

DOI: https://doi.org/10.5281/zenodo.17277623

The influence of IoT and AI convergence on industrial automation ecosystems

Authors: Nisha Choudhury

Abstract: The convergence of the Internet of Things (IoT) and Artificial Intelligence (AI) is revolutionizing industrial automation ecosystems. This integration is driving the transformation of traditional manufacturing and production processes into smart, connected systems that optimize operations, enhance productivity, and facilitate predictive maintenance. IoT facilitates the collection and communication of vast amounts of real-time data from sensors, devices, and machinery across an industrial environment. Coupling this with AI’s ability to analyze complex data patterns, predict outcomes, and automate decision-making processes creates unprecedented opportunities for efficiency and innovation in industrial automation. The synergy between IoT and AI fosters smarter factories, where machines self-optimize, autonomous robots collaborate with human workers, and supply chain logistics are seamlessly managed. This article explores how this dynamic convergence alters industrial workflows, impacts operational resilience, improves safety standards, and cultivates new business models. It also examines the challenges related to cybersecurity, data privacy, and system integration that accompany this digital evolution. Ultimately, the IoT-AI alliance is not only reshaping industrial automation with smarter, more adaptive ecosystems but also paving the way for Industry 4.0 and beyond, ushering in a new era where intelligent systems drive industrial competitiveness and sustainability.
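The predictive-maintenance idea in the abstract reduces, at its simplest, to flagging sensor readings that deviate sharply from recent behaviour. The toy below uses a rolling z-score; the window size, threshold, and vibration values are illustrative assumptions, not figures from any industrial deployment.

```python
# Toy predictive-maintenance check: flag sensor readings that deviate sharply
# from the recent mean (a rolling z-score). Window size and threshold are
# illustrative assumptions, not values from any specific industrial system.
from statistics import mean, stdev

def anomalies(readings, window=5, threshold=3.0):
    flagged = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            flagged.append(i)  # reading i is far outside recent behaviour
    return flagged

vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 9.5, 1.0]
flagged = anomalies(vibration)
```

Production systems replace this statistic with learned models, but the shape of the loop — baseline, deviation, alert — is the same.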

DOI: https://doi.org/10.5281/zenodo.17776309


The influence of serverless computing models on enterprise cost optimization

Authors: Akash Gupta

Abstract: Serverless computing has emerged as a transformative paradigm in enterprise IT, fundamentally altering how organizations approach infrastructure management and cost efficiency. This model abstracts away the complexities of server administration, allowing enterprises to focus solely on developing and deploying applications without the burden of maintaining underlying hardware. By leveraging a pay-as-you-go pricing mechanism, serverless computing enables significant cost reductions, as organizations are billed exclusively for the actual resources consumed during code execution rather than for idle infrastructure. Such dynamic resource allocation facilitates optimal usage, particularly beneficial for companies with unpredictable or fluctuating workloads. Furthermore, serverless architectures streamline development workflows by eliminating the need for manual scaling and capacity planning, resulting in improved operational agility and innovation throughput. Nonetheless, enterprises must be mindful of potential challenges, including workload-specific cost unpredictability, cold start latencies, and vendor lock-in. The strategic adoption of serverless models, underpinned by comprehensive cost monitoring and optimizations, can unlock substantial benefits in terms of both direct and indirect cost savings, agility, and overall IT modernization. As this article will demonstrate, serverless computing stands at the intersection of technological advancement and cost optimization—reshaping enterprise architectures for the demands of today’s digital economy.
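The pay-as-you-go claim in the abstract is easy to make concrete with back-of-envelope arithmetic: billing scales with invocations, duration, and memory rather than uptime. The rates below are hypothetical placeholders, not any vendor's actual pricing.

```python
# Back-of-envelope comparison for pay-per-use billing. The rates below are
# illustrative assumptions, not any cloud vendor's published pricing.

GB_SECOND_RATE = 0.0000167   # $ per GB-second of execution (hypothetical)
REQUEST_RATE = 0.0000002     # $ per invocation (hypothetical)

def serverless_monthly_cost(invocations, avg_duration_s, memory_gb):
    compute = invocations * avg_duration_s * memory_gb * GB_SECOND_RATE
    requests = invocations * REQUEST_RATE
    return compute + requests

# 2 million invocations/month, 200 ms each, at 0.5 GB of memory:
cost = serverless_monthly_cost(2_000_000, 0.2, 0.5)
```

Under these assumed rates the workload costs a few dollars a month, versus a fixed fee for a server that sits mostly idle — which is exactly why bursty, unpredictable workloads benefit most, while sustained heavy workloads can tip the comparison the other way.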

DOI: https://doi.org/10.5281/zenodo.17776418

The Impact of AI-Based Threat Intelligence on Proactive Cybersecurity Management

Authors: Dipesh Adhikari

Abstract: Artificial Intelligence (AI) has rapidly evolved as a cornerstone technology in defending modern digital infrastructures. The exponential rise in cyber-attacks, ranging from state-sponsored espionage to ransomware, has pushed organizations to adopt intelligent systems that can predict, detect, and mitigate threats in real time. Traditionally, cybersecurity has been reactive, relying on predefined rules and manual incident response. However, the volume and sophistication of cyber threats now exceed human capacity for timely detection. AI-based threat intelligence represents a paradigm shift toward a proactive approach, empowering analysts with predictive capabilities and context-aware automation. It integrates machine learning, natural language processing, and behavioral analytics to derive actionable intelligence from vast and diverse data sources, such as network traffic, system logs, dark web forums, and social platforms. This intelligence can forecast potential attack vectors, identify anomalies, and optimize defense mechanisms before a breach occurs. The deployment of AI in threat intelligence enhances the precision of anomaly detection, improves situational awareness, and enables dynamic risk assessment. AI-driven systems continuously learn from data, adapting to new threat patterns and minimizing false positives. However, challenges remain, including algorithmic bias, adversarial attacks, data privacy, and the dependence on high-quality labeled datasets. Moreover, the integration of AI into cybersecurity ecosystems demands proper governance structures, skilled professionals, and regulatory alignment to prevent misuse or overreliance. This article explores the role of AI-based threat intelligence in advancing proactive cybersecurity management. It critically examines technological foundations, integration models, challenges, and future trajectories. 
By combining technological insights with strategic perspectives, this work aims to provide a holistic understanding of how intelligent systems are transforming threat prediction, detection, and response across sectors. As global digitalization intensifies, leveraging AI-based intelligence for proactive security management will become an indispensable necessity rather than a strategic option.
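One small ingredient of the threat-intelligence pipeline described above is scoring how unusual an observed event is against a historical baseline. The frequency-based toy below stands in for what real systems do with ML models over logs and network traffic; the event names are invented for illustration.

```python
# Minimal sketch of anomaly scoring for security events: rarer event types
# get higher scores. Real AI-based threat intelligence combines ML models
# over many data sources; this frequency-based toy only shows the principle.
import math
from collections import Counter

def rarity_scores(history, new_events):
    """Higher score = rarer event type; unseen events score highest."""
    counts = Counter(history)
    total = len(history)
    scores = {}
    for event in new_events:
        # Laplace smoothing so previously unseen events get a finite score
        p = (counts[event] + 1) / (total + len(counts) + 1)
        scores[event] = -math.log(p)
    return scores

history = ["login_ok"] * 95 + ["login_fail"] * 5
scores = rarity_scores(history, ["login_ok", "login_fail", "priv_escalation"])
```

Ranking events this way lets an analyst triage the never-before-seen `priv_escalation` first, which is the proactive posture the abstract argues for.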

DOI: https://doi.org/10.5281/zenodo.17840010

The Impact of AI-Driven Orchestration on Resource Utilization in Hybrid Cloud Platforms

Authors: Rashmi K. Nair

Abstract: Artificial intelligence (AI) has rapidly become the linchpin for modern cloud management, especially in the orchestration of hybrid cloud environments that span both public and private infrastructures. AI-driven orchestration leverages advanced algorithms, including machine learning and predictive analytics, to transform traditional, manually operated workflows into dynamically optimized, autonomous cloud ecosystems. This paradigm shift addresses persistent challenges such as operational complexity, resource inefficiency, and the need for real-time decision-making. By intelligently automating workload distribution, scaling resources predictively, enhancing security through anomaly detection, and enabling self-healing of cloud infrastructure, AI fundamentally redefines resource utilization across hybrid cloud platforms. Organizations adopting AI-driven orchestration experience not only improved performance and reduced costs but also increased responsiveness and operational reliability. Through continuous analysis of historical and real-time data, AI delivers actionable insights for optimal resource allocation, reduces human error, and positions businesses to respond proactively to fluctuating demands and evolving threats in the cloud. This article delves into the mechanisms and impacts of AI-powered orchestration, exploring its transformative potential for efficiency, scalability, and security in heterogeneous cloud environments. Key implementation strategies, challenges, and future directions are examined, illustrating how AI-driven orchestration is shaping the future of cloud computing for enterprises worldwide.
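The "scaling resources predictively" mechanism the abstract mentions can be sketched as a forecast-then-provision loop: estimate the next load level from recent samples, then size capacity to the forecast plus headroom. The smoothing factor, per-replica capacity, and traffic figures below are illustrative assumptions.

```python
# Toy predictive scaling: forecast the next load level with an exponential
# moving average and size replicas to it, plus headroom. The smoothing factor
# and per-replica capacity are illustrative assumptions.
import math

def forecast_load(samples, alpha=0.5):
    """Exponentially weighted forecast of the next load sample."""
    estimate = samples[0]
    for s in samples[1:]:
        estimate = alpha * s + (1 - alpha) * estimate
    return estimate

def replicas_needed(samples, capacity_per_replica=100, headroom=1.2):
    return max(1, math.ceil(forecast_load(samples) * headroom / capacity_per_replica))

n = replicas_needed([80, 120, 160, 220])  # requests/sec, trending upward
```

Because the forecast weights recent samples more heavily, an upward trend provisions extra replicas before demand arrives, rather than reacting after utilization spikes.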

DOI: https://doi.org/10.5281/zenodo.17840060

Toward Self-Optimizing Enterprise Data Pipelines: AI-Assisted Performance Tuning for PL/SQL and Informatica Workflows

Authors: Srujana Parepalli

Abstract: Performance optimization of enterprise data pipelines has traditionally relied on rule-based heuristics, manual tuning cycles, and the accumulated intuition of experienced practitioners; however, as data volumes scale into terabytes and petabytes, workloads become increasingly heterogeneous, and execution environments span databases, ETL engines, and distributed infrastructure, these approaches struggle to deliver consistent and timely results. This paper presents an AI-assisted performance tuning framework for PL/SQL execution environments and Informatica PowerCenter workflows that augments established database performance metrics such as execution plans, wait events, resource utilization, and ETL session statistics with machine-learning-driven optimization techniques capable of learning from historical workload behavior. Building on foundational research in automatic database tuning, self-managing and autonomic systems, and ETL performance engineering, the proposed architecture continuously correlates workload characteristics, configuration parameters, and observed performance outcomes to generate data-driven recommendations for optimal SQL execution strategies, memory and session configurations, partitioning schemes, and workflow design patterns. By synthesizing academic research and industry practices published between 2000 and 2017, the study illustrates how AI-based optimization complements traditional tuning methods by reducing manual intervention, improving adaptability to changing data patterns, and delivering measurable improvements in throughput, latency, and operational stability across large-scale enterprise data platforms.
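The recommendation step the abstract describes — mapping observed workload metrics to tuning actions — can be sketched as a small rule set. The metric names and thresholds below are purely illustrative; the paper's framework learns such rules and thresholds from historical workload behaviour rather than hardcoding them.

```python
# Simplified sketch of metric-driven tuning recommendations. Metric names and
# thresholds are illustrative assumptions; the AI-assisted framework described
# above would learn them from historical execution data instead.

def tuning_recommendations(stats):
    """Return tuning hints derived from session statistics (toy rules only)."""
    recs = []
    if stats.get("full_table_scans", 0) > 100:
        recs.append("review indexing/partitioning for frequently scanned tables")
    if stats.get("sort_spills_to_disk", 0) > 0:
        recs.append("increase session sort/cache memory to avoid disk spills")
    if stats.get("avg_commit_batch", 1) < 100:
        recs.append("batch commits to reduce log-write wait events")
    return recs

hints = tuning_recommendations(
    {"full_table_scans": 250, "sort_spills_to_disk": 12, "avg_commit_batch": 1}
)
```

Replacing hand-written thresholds like these with values fitted to observed throughput and latency is precisely where the ML-driven part of the proposed architecture comes in.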

Optimizing CI/CD Pipelines for Scalable Enterprise Cloud Applications: Architecture, Automation, and Deployment Strategies

Authors: Shekar Vollem

Abstract: Enterprise cloud applications are increasingly required to support rapid software delivery, continuous updates, and highly reliable deployment cycles in order to meet the growing demands of digital transformation, global scalability, and user expectations for uninterrupted services. Continuous Integration and Continuous Delivery (CI/CD) pipelines have emerged as critical infrastructure components that enable automated building, testing, and deployment of applications in modern DevOps environments. These pipelines integrate development, testing, and operational workflows, allowing software changes to be validated and deployed in a consistent and repeatable manner. However, large-scale enterprise systems face significant challenges in optimizing CI/CD pipelines due to complex application architectures, distributed development teams, microservice dependencies, heterogeneous cloud infrastructures, and stringent compliance or security requirements. Inefficient pipelines can introduce bottlenecks in build processes, increase testing overhead, and slow down deployment cycles, thereby affecting overall software delivery performance. This paper explores strategies for optimizing CI/CD pipelines in enterprise cloud environments, focusing on automation frameworks, pipeline orchestration mechanisms, intelligent test management, infrastructure-as-code practices, and scalable deployment models that support cloud-native architectures. By analyzing existing research studies, DevOps methodologies, and industry practices, the study highlights architectural patterns, deployment pipeline designs, and continuous engineering principles that enhance the efficiency, scalability, and reliability of software delivery systems. 
The findings demonstrate that optimized CI/CD pipelines significantly improve release velocity, enable faster feedback loops for developers, reduce operational risks associated with manual deployments, and support scalable cloud-native application development while maintaining high standards of software quality and system stability.
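One concrete form of the "intelligent test management" mentioned above is change-based test selection: run only the tests whose modules depend on the files touched by a commit. The dependency map below is hardcoded for illustration; real pipelines derive it from build metadata or coverage data.

```python
# Sketch of change-based test selection for a CI/CD pipeline. The dependency
# map is a hardcoded illustration; real systems derive it from build graphs
# or per-test coverage data.

DEPENDS_ON = {
    "test_billing": {"billing.py", "currency.py"},
    "test_auth":    {"auth.py", "session.py"},
    "test_ui":      {"ui.py"},
}

def select_tests(changed_files):
    """Return the tests whose dependencies intersect the changed files."""
    changed = set(changed_files)
    return sorted(t for t, deps in DEPENDS_ON.items() if deps & changed)

selected = select_tests(["currency.py", "session.py"])
```

Skipping unaffected suites is one of the main levers for shortening the feedback loop the paper highlights, at the cost of keeping the dependency map accurate.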

DOI: https://doi.org/10.5281/zenodo.19208630
