Computer Science
Latest research topics for Computer Science are:
Data Mining includes the analysis of large amounts of unorganized data in the form of text files, images, tabular data, etc. This is further classified by specific area of research, such as:
- Web Mining: Website-related work, such as page content optimization, is carried out using features of the web log, web content, and web structure.
Web Page Prediction: This comes under web mining, where the web log is used to understand user behavior on the website; page content is also used in this step. Algorithms such as ant colony optimization and Markov models are used for this.
Web Page Ranking: In this work a website's various pages are analyzed and ranked using methods such as Google rank, linear rank, and PageRank.
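The page-ranking methods above can be illustrated with a minimal PageRank power iteration; the four-page link graph below is invented for the example and is not taken from any dataset on this page:

```python
# Hypothetical four-page site: each page maps to the pages it links to.
links = {
    "home":    ["about", "papers"],
    "about":   ["home"],
    "papers":  ["home", "about", "contact"],
    "contact": ["home"],
}

def pagerank(links, d=0.85, iters=50):
    """Power-iteration PageRank with damping factor d."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # every page keeps a (1 - d) / n base share
        new = {p: (1 - d) / n for p in pages}
        # each page distributes its damped rank evenly over its outlinks
        for p, outs in links.items():
            share = rank[p] / len(outs)
            for q in outs:
                new[q] += d * share
        rank = new
    return rank

ranks = pagerank(links)
# "home" collects inbound links from every other page, so it ranks highest.
print(max(ranks, key=ranks.get))  # -> home
```

In practice, libraries such as NetworkX provide this computation; the loop above only shows the core idea that a page's rank is the damped sum of rank shares flowing in from pages that link to it.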
- Text Mining: Documents are arranged, summarized, or fetched using pattern or term features.
Content Retrieval / Document Retrieval / Information Retrieval: In this work text files are either arranged in a specific order, or a list of files is fetched based on the user's query.
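The retrieval step described here can be sketched with standard TF-IDF term weighting; the three documents and the queries below are toy examples, not drawn from any dataset listed on this page:

```python
import math
from collections import Counter

# Toy corpus standing in for a text-file collection (invented documents).
docs = {
    "d1": "hockey world cup final match report",
    "d2": "world cup debate and public opinion",
    "d3": "image processing and shape retrieval",
}

def tfidf_vectors(docs):
    """Term frequency x inverse document frequency, per document."""
    n = len(docs)
    tokenized = {d: text.split() for d, text in docs.items()}
    df = Counter()                       # document frequency per term
    for toks in tokenized.values():
        df.update(set(toks))
    return {
        d: {t: (c / len(toks)) * math.log(n / df[t])
            for t, c in Counter(toks).items()}
        for d, toks in tokenized.items()
    }

def retrieve(query, vecs):
    """Rank documents by the summed TF-IDF weight of the query terms."""
    terms = query.split()
    scores = {d: sum(v.get(t, 0.0) for t in terms) for d, v in vecs.items()}
    return sorted(scores, key=scores.get, reverse=True)

vecs = tfidf_vectors(docs)
print(retrieve("hockey match", vecs)[0])  # -> d1
```

Production systems would use a library such as scikit-learn's `TfidfVectorizer` plus cosine similarity, but the weighting and ranking steps are the same in spirit.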
- Temporal Mining: Analysis is done on the basis of timestamps, where information is summarized according to when events happened and what caused them.
Event Activity Happening: In this work events are proposed along with their probability of occurrence.
Nature Prediction: Large amounts of information are gathered from satellite images for predicting glacier movements, galaxy analysis, etc.
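A toy sketch of the timestamp-based summarization described above: estimating, from a made-up event log, the probability that an event falls in a given hour of the day (all log entries below are invented):

```python
from collections import Counter
from datetime import datetime

# Hypothetical event log: (event name, timestamp) pairs.
events = [
    ("login", "2017-06-01 09:15"),
    ("login", "2017-06-02 09:40"),
    ("error", "2017-06-02 14:05"),
    ("login", "2017-06-03 10:02"),
    ("error", "2017-06-03 14:30"),
    ("login", "2017-06-04 09:55"),
]

def hourly_probability(events, name):
    """Estimate P(hour | event) from timestamp counts."""
    hours = Counter(
        datetime.strptime(ts, "%Y-%m-%d %H:%M").hour
        for ev, ts in events if ev == name
    )
    total = sum(hours.values())
    return {h: c / total for h, c in hours.items()}

probs = hourly_probability(events, "login")
# logins cluster in the 9 o'clock hour in this toy log
print(max(probs, key=probs.get))  # -> 9
```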
Datasets available for research in Computer Science:
- Adult Dataset (code 101): for pattern recognition and privacy-preserving mining. It contains 32,560 sessions with 11 fields, where attributes include both numeric and textual data.
- Information Retrieval Dataset (code 102): a collection of 1,000 images in ten categories (people, elephant, bus, etc.), with 100 images per category. This dataset is used for image retrieval, fetching, clustering, etc.
- Text File Classification (code 103): a small collection of 1,000 text files containing articles on hockey and different world-cup debates. This dataset is used for text mining tasks such as classifying text files, fetching relevant documents for a user query, conclusion/abstract creation, disputant identification, etc.
- Tax Relation Dataset (code 104): contains taxpayer relations with company details, transactions between them, and individual relations. The dataset is used for generating association rules, tax-evasion identification, business transaction identification, etc.
- Brain Tumor Segmentation Dataset (code 105): 50 images with their ground truths, so the dataset has 100 images in total. Each image has skull, brain, and tumor sections, so scholars can segment an image into three classes.
- Jaffe Images (code 106): very helpful for studying facial expressions; seven emotions are expressed by 11 Japanese girls. This dataset is used for image fragmentation, expression recognition, face recognition, etc.
- Chess Dataset (code 107): a collection of 76 items across 3,196 transactions; a set of numeric IDs of the frequent items purchased from a store. It is used for association rule mining, privacy preserving, FP-tree, etc.
- Image Watermarking (code 108): a standard set of images in both color and grayscale formats, in two sizes (256×256 and 512×512); the images include Mandrill, Lena, etc. It is used for image watermarking, cryptography, encryption, decryption, etc.
- Character Recognition (code 109): a collection of images for identifying the characters present in the hand gestures used by speech-impaired people. It is used in image processing for shape identification.
- Object Detection (code 110): a small set of videos in which people perform various actions such as running, walking, jumping, etc.
- Web Log (code 111): the web log of the well-known NASA site, with the complete paths of various random surfers; it contains 10,000 sessions in a text file.
- Twitter Sentiment (code 112): sentiments of tweets made by users across different emotions. It can be used for sentiment analysis, classification, emotion identification, etc.
- Geospatial Tagging (code 113): images tagged according to geographical coordinates, specified by longitude and latitude values. It is used in geographical-location-based learning, etc.
- SAR Image (code 114): a set of SAR images over different time spans for the study of ice/snow precipitation, melting rate, etc. Researchers can use it for segmentation, rate identification, etc.
- Wisconsin Breast Cancer (code 1014): a numeric-value dataset used for identifying clusters of data that tend toward breast cancer. It is used for pattern recognition, binary-classification testing techniques, etc.
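For the association rule mining mentioned under the Chess Dataset (code 107), the basic support-counting step can be sketched as follows; the transactions below are invented for illustration, not drawn from the dataset:

```python
from itertools import combinations
from collections import Counter

# Hypothetical transactions of numeric item IDs (same shape as code 107).
transactions = [
    {1, 2, 3},
    {1, 2},
    {2, 3},
    {1, 2, 3, 4},
    {1, 3},
]

def frequent_pairs(transactions, min_support=0.4):
    """Return item pairs whose support (fraction of transactions
    containing both items) meets the threshold."""
    counts = Counter()
    for t in transactions:
        counts.update(combinations(sorted(t), 2))
    n = len(transactions)
    return {pair: c / n for pair, c in counts.items() if c / n >= min_support}

freq = frequent_pairs(transactions)
# items 1 and 2 co-occur in 3 of 5 transactions -> support 0.6
print(freq[(1, 2)])  # -> 0.6
```

Full Apriori or FP-tree algorithms extend this counting to larger itemsets and then derive rules with confidence thresholds; this sketch shows only the support computation they are built on.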
IJSRET Volume 3 Issue 6, November-2017
An Approach for Trusted Computing of Load Balancing in Cloud Environment
Authors: Manoj Kumar Selkare, Vimal Shukla
Abstract: Cloud computing is a novel approach to using computing resources, where these resources may be hardware or software. The facility is delivered as a service over a communication network and is known as the cloud, an abstraction for the complex underlying infrastructure. Cloud computing services involve trusted remote user data and computer software. This paper proposes an approach to efficient load balancing in cloud computing.
Low Power Three Input XOR Gate For Arithmetic And Logical Operation
Authors: Ms. Poojashree Sahu, Mr. Ashish Raghuwanshi
Abstract: With the advancement of microelectronics technology scaling, the main design objective, i.e. low power consumption, can be easily achieved. For any digital logic design, power consumption depends on supply voltage, the number of transistors incorporated in the circuit, and their scaling ratios. As CMOS technology supports inversion logic designs, NAND and NOR structures are useful for converting any logic equation into a physical-level design comprising PMOS and NMOS transistors. In a similar way, logic can be implemented in other styles as well, differing in the number of transistors required. The conventional CMOS design for three-input XOR logic is possible with 10 or more transistors; with the methodology discussed in this paper, the same three-input XOR design can be made possible with 16 transistors. The proposed methodology consists of a transmission gate and systematic cell design methodology (SCDM). This design has 45% (35%) less power dissipation than the conventional LPHS-FA and SCDM-based XO10 XOR logic designs in CMOS technology. Since XOR logic is useful in a variety of applications, such as data encryption, arithmetic circuits, and binary-to-Gray encoding, it has been selected for this design. The design explained in this paper is simulated in 130nm technology.
Low Read Power Delay Product Based Differential Eight Transistor SRAM cell
Authors: Ms. Jaya Sahu, Mr. Ashish Raghuwanshi
Abstract: SRAM is designed to provide temporary storage for the central processing unit and to replace dynamic systems that require very low power consumption. Low-power SRAM design is a critical aspect, since SRAM takes a large fraction of total power and die area in high-performance processors. This paper presents an eight-transistor SRAM cell with a smaller read power-delay product, offering 28% (74%) smaller read '0' ('1') power-delay product than the existing 7T cell. The SRAM cell's read cycle is characterized at 45nm technology using a SPICE EDA tool.
A Robust Classification Algorithm for Multiple Type of Dataset
Authors: M.Tech. Scholar Afshan Idrees, Prof. Avinash Sharma
Abstract: With the increase in internet services, the number of users is also increasing, and while using a service a user may be at risk when sharing data. This work therefore focuses on increasing the security of user data during a classification service. The algorithm provides robustness by encrypting the data before sending it to the server, and the server classifies the data in encrypted form. As a further security measure, instead of transferring the whole encrypted dataset, features are first extracted from the data, then encrypted and sent to the server for classification. The proposed work successfully classifies all types of user data: text, image, and numeric.
An Unsupervised TLBO Based Drought Prediction By Utilizing Various Features
Authors: M.Tech. Scholar Shikha Ranjan Patel, Prof. Priyanka Verma
Abstract: Agricultural vulnerability is generally referred to as the degree to which agricultural systems are likely to experience harm due to a stress. In this work, an existing analytical method to quantify vulnerability was adopted to assess the magnitude as well as the spatial pattern of agricultural vulnerability under varying drought conditions. The standardized precipitation index (SPI) was used as a measure of drought severity. A number of features, including the normalized difference vegetation index (NDVI), vegetation condition index (VCI), and SPI, are used for classification. The proposed model uses the Teaching-Learning-Based Optimization (TLBO) genetic approach to classify the different locations present in the geospatial dataset. With the TLBO approach, no prior knowledge is required. Experimental results show that the proposed work performs better than previous work.
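The NDVI and VCI features named in this abstract follow standard remote-sensing definitions: NDVI = (NIR − Red) / (NIR + Red), and VCI rescales the current NDVI against its historical minimum and maximum. A minimal sketch with invented reflectance values (not data from the paper):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel/band pair."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

def vci(ndvi_now, ndvi_min, ndvi_max):
    """Vegetation Condition Index: current NDVI relative to its
    historical range, expressed as a percentage."""
    return 100.0 * (ndvi_now - ndvi_min) / (ndvi_max - ndvi_min)

# Hypothetical reflectance values for one location.
current = ndvi(nir=0.45, red=0.15)
print(round(current, 2))                                   # -> 0.5
print(round(vci(current, ndvi_min=0.2, ndvi_max=0.8), 1))  # -> 50.0
```

These per-pixel values would form the feature vectors that the TLBO-based classifier in the paper operates on.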
Optimizing Material Management through Advanced System Integration, Control Bus, and Scalable Architecture
Authors: RamaKrishna Manchana
Abstract:This paper presents an advanced approach to material management by leveraging modern system integration techniques and scalable microservices architecture. The proposed solution addresses the limitations of traditional monolithic systems by introducing microservices, control bus mechanisms, and event-driven designs that enhance operational efficiency and scalability. By utilizing advanced integration patterns, including synchronous and asynchronous services, the system improves real-time data processing and decision-making capabilities. This document outlines the technical architecture, key components, integration designs, and implementation strategies that underpin a robust and adaptable material management system, demonstrating significant improvements in scalability, performance, and responsiveness.
DOI: 10.61137/ijsret.vol.3.issue6.200

Optimizing Healthcare Data Warehouses for Future Scalability: Big Data and Cloud Strategies
Authors: Srinivasa Chakravarthy Seethala
Abstract:Healthcare organizations generate vast amounts of data, driven by regulatory compliance, patient care needs, and advances in medical technology. Legacy data warehouses, while central to healthcare data management, often struggle to accommodate escalating data volumes, new data types, and real-time processing demands. This paper presents strategic insights into leveraging Big Data and cloud computing to modernize healthcare data warehouses for future scalability. We examine technical approaches, review cloud and Big Data integration techniques, and propose a roadmap for healthcare data scalability, addressing concerns of security, compliance, and data interoperability.
DOI: 10.61137/ijsret.vol.3.issue6.201

Driving Business Decisions With Data: A Practical Framework For Successful Power BI Adoption
Authors: Anjali Thomas
Abstract: In today’s competitive business landscape, data-driven decision-making has become a strategic imperative. Organizations are increasingly turning to business intelligence (BI) platforms to transform raw data into actionable insights that guide growth, efficiency, and innovation. Among these platforms, Power BI stands out as a versatile solution that bridges the gap between technical complexity and user accessibility. This review article presents a comprehensive framework for successful Power BI adoption, emphasizing the interplay between governance, integration, scalability, and organizational readiness. The paper begins by outlining the challenges enterprises face when shifting from intuition-based management to data-centric practices, highlighting issues of data silos, inconsistent reporting, and resistance to cultural change. It then explores how Power BI’s architecture—spanning ETL processes, SQL integration, cloud deployment, and security mechanisms—can serve as the backbone for a sustainable BI strategy. The review further examines practical use cases across industries, DevOps-driven automation, and the role of training programs in fostering a self-service analytics culture. Through a critical discussion of opportunities and limitations, the article underscores that successful Power BI adoption requires more than technology; it demands alignment between people, processes, and platforms. By providing a structured roadmap, this study offers organizations a pragmatic guide to embedding Power BI within their BI lifecycle. The conclusion reaffirms that Power BI is not simply a reporting tool but a catalyst for building data-driven cultures that enhance agility, competitiveness, and long-term decision-making excellence.
DOI: https://doi.org/10.5281/zenodo.17275574
Creating A Single Source Of Truth: Data Governance With Power BI, SQL, And Effective ETL Processes
Authors: Vivek Sharma
Abstract: In contemporary enterprises, data fragmentation across multiple systems, departments, and formats poses significant challenges to decision-making, reporting accuracy, and operational efficiency. A Single Source of Truth (SSOT) addresses these challenges by consolidating heterogeneous data into a centralized, authoritative repository. This review examines the implementation of SSOT using SQL databases, robust ETL pipelines, and Power BI for visualization and governance. It explores the principles of data governance, including data ownership, quality control, role-based security, and regulatory compliance, emphasizing their critical role in maintaining data integrity and trustworthiness. The review also details best practices for relational database design, performance optimization, and ETL automation to ensure timely and accurate data delivery. Case studies across healthcare, financial services, and retail illustrate practical applications, demonstrating improved reporting efficiency, operational responsiveness, and decision-making capabilities. Furthermore, the integration of SSOT across enterprise workflows, combined with monitoring, audit trails, and automated alerts, underscores the value of a governed, centralized data ecosystem. The article highlights current challenges, including system complexity, adoption barriers, and legacy integration, and offers strategies for mitigation. Looking forward, emerging trends such as cloud-native architectures, real-time streaming, AI-enhanced analytics, and hybrid or federated data models suggest new avenues for enhancing SSOT utility and scalability. By providing a comprehensive framework, this review underscores the strategic, operational, and compliance benefits of SSOT, positioning it as a cornerstone for modern, data-driven enterprises seeking reliability, agility, and insight-driven decision-making.
DOI: https://doi.org/10.5281/zenodo.17275623
Tableau’s Secret Sauce: Leveraging RHEL And Centos For High-Performance Data Visualization
Authors: Kavya Menon
Abstract: Modern enterprises increasingly rely on business intelligence (BI) platforms to transform raw data into actionable insights. Tableau, as a leading BI tool, offers sophisticated visualization, analytics, and reporting capabilities. However, the underlying operating environment significantly impacts performance, scalability, security, and cost efficiency. This review explores the strategic advantages of deploying Tableau on Linux-based systems, specifically Red Hat Enterprise Linux (RHEL) and CentOS, for enterprise-grade BI implementations. It examines the role of Linux in enhancing system stability, providing robust security frameworks, supporting modular and automated workflows, and enabling high availability and scalability. The article analyzes data integration strategies, ETL pipelines, and dashboard optimization practices tailored to Linux environments, emphasizing both operational efficiency and user experience. Case studies across healthcare, finance, and retail illustrate real-world applications, demonstrating how Linux-based Tableau deployments support secure, high-performance analytics, regulatory compliance, and business agility. Furthermore, the review addresses monitoring, maintenance, and performance tuning, highlighting best practices for sustained system reliability. Future trends, including AI integration, real-time streaming, hybrid cloud architectures, and advanced automation, are discussed to illustrate the evolving landscape of enterprise BI. By combining Tableau’s visualization capabilities with Linux’s reliability and flexibility, organizations can achieve cost-effective, scalable, and secure BI solutions. This article underscores the importance of selecting an appropriate operating environment to maximize Tableau’s potential and provides a comprehensive guide for IT professionals, analysts, and business leaders seeking to optimize their BI infrastructure.
DOI: https://doi.org/10.5281/zenodo.17275816
Migrating Legacy Information Management Systems To AWS And GCP: Challenges, Hybrid Strategies, And A Dual-Cloud Readiness Playbook
Authors: Sudhir Vishnubhatla
Abstract: Legacy Information Management Systems (IMS) remain central to operations in banking, healthcare, public sector, and media, yet their monolithic design, proprietary data formats, and brittle integrations have become barriers to agility and intelligent analytics. Earlier literature identified the persistent costs and risks of legacy IMS and proposed incremental modernization through wrapping, service extraction, and reengineering, while empirical studies in the early 2010s established the feasibility and business value of infrastructure-as-a-service (IaaS) rehosting. As of late 2017, however, the migration decision space has widened: enterprises are not merely choosing whether to move to the cloud, but how to distribute workloads across multiple hyperscalers, primarily Amazon Web Services (AWS) and Google Cloud Platform (GCP). This article synthesizes more than a decade of academic and industrial work on legacy migration and proposes a Dual-Cloud Readiness Playbook tailored to IMS modernization. The playbook comprises a readiness scorecard, a five-phase migration lifecycle, and a layered hybrid architecture that balances compliance, cost, and capability while reducing vendor lock-in. The result is a pragmatic framework that aligns with the realities of petabyte-scale content archives, metadata-heavy workflows, and emerging regulatory constraints, offering a credible path from monolithic legacy platforms to modern, cloud-native information management.
The Impact Of Strategic Stress Management On Employee Retention In High-Pressure Global Service Sectors
Authors: Dipikaben Solanki
Abstract: The global service industry has come under psychological strain due to economic unpredictability, technological growth, and heightened performance standards. Banking, information technology, and healthcare are among the high-pressure sectors where burnout and high employee turnover have been reported, posing serious organisational and economic problems. This study reviews how strategic stress management influences employee retention in these industries through a comparative cross-national study. Adopting an interpretivist and inductive philosophical approach, the research employs secondary qualitative data to compare the prevalence of stress, turnover patterns, and organisational responses across national settings. The results show a close correlation between workplace stress and turnover: emotional exhaustion, heavy workload, and lack of managerial support have decreased organisational commitment and increased intentions to leave. Nevertheless, organised stress management solutions such as supportive leadership behaviour and workload modification have produced positive short-term retention outcomes. Cross-national differences also show that organisational design, economic stress, and sectoral characteristics determine the intensity of burnout and workforce stability. The analysis finds that sustainable human resource management should include mental health strategies as an organisational priority rather than a reactive welfare intervention. Policy implications include stress audits, the inclusion of mental health performance indicators, and the creation of crisis-responsive HR structures. Strategic stress management has hence gained prominence as an economic stability mechanism and a workforce sustainability tool in the global service industries.
IJSRET Volume 3 Issue 5, September-2017
Secret Sharing Schemes over MANET to Avoid Cheater Participation [102-109]
Author: Nisha Bharti, Hansa Acharya

Web URL Classification and Malicious Activities: A Review [110-115]
Malicious Web URL Classification using Evolutionary Algorithm [116-199]
Attack over Email System: Review [200-206]
Author: Anuradha Kumari, Nitin Agrawal, Umesh Lilhore

Encryption Scheme for Mobile Ad Hoc Networks: A Survey [207-209]
Author: Neha Dwivedi, Dr. Rajesh Shukla

Evolutionary Algorithm Based Optimized Encryption Scheme for Mobile Ad-Hoc Network [210-216]
Author: Neha Dwivedi, Dr. Rajesh Shukla

Power BI’s Role In The BI Lifecycle: A Complete Guide To Implementation, Development, And Maintenance
Authors: Joseph Fernandes
Abstract: Power BI has established itself as a versatile and comprehensive platform for the business intelligence (BI) lifecycle, supporting data integration, development, visualization, collaboration, and ongoing maintenance. This review article examines Power BI’s capabilities in consolidating heterogeneous data sources, performing robust ETL transformations, and delivering interactive dashboards that provide actionable insights for enterprise decision-making. The discussion explores key aspects of implementation, including agile development methodologies, data governance, role-based access controls, and performance optimization techniques. Case studies across healthcare, retail, and finance demonstrate the platform’s practical impact, highlighting efficiency gains, improved reporting accuracy, real-time analytics, and enhanced regulatory compliance. Additionally, the article addresses common challenges such as integration complexity, technical skill requirements, and governance concerns, providing recommendations for mitigation. Emerging trends such as AI-driven analytics, predictive modeling, real-time streaming data, and cloud-native architectures are analyzed, illustrating the evolving role of Power BI in enabling intelligent decision-support systems. The review emphasizes the strategic advantages of Power BI, including democratization of analytics, scalability, and adaptability to diverse organizational requirements. By synthesizing current practices, technological capabilities, and future innovations, this article provides a roadmap for leveraging Power BI effectively to drive operational efficiency, data-driven decision-making, and organizational agility in dynamic business environments.
DOI: https://doi.org/10.5281/zenodo.17275705
From Spreadsheets To Stories: Creating Actionable Insights With Tableau And The Business Intelligence Lifecycle
Authors: Rani Kumari
Abstract: The transition from traditional reporting methods to interactive, data-driven dashboards has transformed how organizations interpret and act upon information. This review examines the role of Tableau in the Business Intelligence (BI) lifecycle, focusing on its ability to convert raw data into actionable insights that support both strategic and operational decision-making. Tableau’s integration capabilities, including connections to diverse data sources and support for live or extracted datasets, enable organizations to streamline data preparation, cleansing, and transformation. Its visual analytics and interactive dashboards allow stakeholders to explore trends, perform what-if analyses, and monitor key performance indicators (KPIs) in real time. Advanced features, such as calculated fields, predictive modeling, and integration with AI/ML frameworks, enhance the depth and accuracy of insights, while collaborative and cloud-enabled solutions facilitate enterprise-wide adoption. Case studies from retail, healthcare, and finance illustrate Tableau’s practical impact in improving operational efficiency, forecasting, and decision support. The review also addresses challenges, including data quality management, user adoption barriers, and performance scaling, highlighting best practices to overcome these limitations. Looking forward, the integration of AI-driven analytics, real-time data streams, and embedded BI promises to expand Tableau’s influence in decision-making workflows. By adopting Tableau strategically, organizations can foster a culture of data literacy, enhance agility, and ensure that insights are actionable, timely, and aligned with business objectives. Overall, Tableau represents a bridge between complex datasets and operational intelligence, providing organizations with a robust, flexible, and scalable BI platform.
DOI: https://doi.org/10.5281/zenodo.17275764
Adaptive Web Interfaces Through Hybrid Server-Client Architecture: Leveraging ASP.NET MVC And React For Context-Aware UI
Authors: Hema Latha Boddupally
Abstract: Adaptive User Interfaces (AUIs) have become increasingly essential as modern software applications are expected to deliver seamless, intuitive, and personalized experiences across a broad spectrum of devices, screen dimensions, accessibility needs, and interaction contexts. With users frequently transitioning between desktops, tablets, mobile phones, and other emerging platforms, traditional fixed or solely responsive design approaches often fall short in addressing deeper adaptive requirements such as behavior-driven adjustments, contextual awareness, and user-specific personalization. This paper presents a hybrid model for AUI development that integrates ASP.NET MVC’s robust server-side rendering pipeline with React’s flexible, component-based client-side architecture, enabling interfaces that not only adapt visually but also evolve functionally based on user roles, preferences, device capabilities, and real-time interaction patterns. By leveraging server-side logic for initial content shaping and client-side React components for dynamic rendering and state-driven updates, the proposed model supports fine-grained adaptation, modular UI evolution, and scalable interface personalization. Building on established concepts in responsive design, adaptive graphical interfaces, and context-aware interaction models, the study outlines key architectural strategies, design principles, and implementation techniques that facilitate the development of maintainable, high-performance AUI systems. Furthermore, the paper examines practical challenges such as context modeling, synchronization between server and client layers, managing diverse user scenarios, and optimizing rendering performance, ultimately demonstrating how the synergy of MVC and React provides a powerful foundation for creating intelligent, user-centered adaptive interfaces capable of meeting the demands of modern digital ecosystems.
Identity-Aware Network Segmentation Using NSX And Next-Generation Firewalls
Authors: Naveen Reddy Burramukku
Abstract: Modern enterprise networks are increasingly dynamic, driven by virtualization, cloud adoption, and the proliferation of distributed workloads. Traditional network segmentation approaches, which rely primarily on IP addresses, VLANs, and perimeter-based firewalls, are no longer sufficient to protect against sophisticated cyber threats, particularly those involving lateral movement within the data center. As attackers increasingly exploit compromised credentials and trusted internal access, there is a growing need for security models that are both granular and identity-centric. This research explores the concept of identity-aware network segmentation by integrating VMware NSX microsegmentation with Next-Generation Firewalls (NGFWs) to enforce security policies based on user, application, and workload identities rather than static network parameters. The proposed approach aligns with Zero Trust principles by assuming no implicit trust within the network and enforcing continuous verification of identity and context for every communication flow. The study presents an architectural framework that combines NSX’s distributed firewall capabilities with advanced NGFW features such as deep packet inspection, application identification, and user-based policy enforcement. A controlled virtual testbed is used to evaluate the effectiveness of the proposed model in mitigating east-west traffic threats, reducing attack surfaces, and limiting lateral movement within a virtualized data center environment. Performance impacts, scalability considerations, and operational complexity are also assessed to determine the feasibility of large-scale deployment. Results indicate that identity-aware segmentation significantly enhances internal network security by enabling fine-grained, context-aware policy enforcement without introducing substantial performance degradation.
The integration of NSX and NGFW technologies provides improved visibility, simplified policy management, and stronger alignment with modern Zero Trust architectures. This research contributes to the growing body of work on software-defined security by demonstrating how identity-driven controls can be practically implemented to strengthen enterprise network defenses in hybrid and cloud-based environments.
High-Efficiency Power Conversion For Global Smart Grid Infrastructures
Authors: Pratikbhai Patel
Abstract: The globalization of power systems with smart grid infrastructure has increased the demand for high-efficiency power conversion technologies that can incorporate renewable energy sources, including solar and wind, into national grids. This study focuses on the significance of an advanced Pulse Width Modulation (PWM) methodology in improving inverter efficiency, minimizing harmonic distortion, and enhancing grid stability. The research uses a systematic analysis methodology to assess converter topologies, modulation schemes, energy storage integration, electric vehicle mechanisms, demand response schemes, and computational intelligence schemes under smart grid conditions. The results show that the optimised PWM methods significantly reduce switching and conduction losses while improving thermal performance and voltage waveform quality. These enhancements lead to growth in renewable penetration and system reliability, and to lower operating costs. In addition, the study emphasizes the need to coordinate inverter efficiency standards, adaptive control, and digital grid coordination to facilitate global sustainable development goals. These findings confirm that high-efficiency power electronic conversion systems are critical enablers of resilient, scalable, and environmentally sustainable smart grid infrastructures worldwide.
IJSRET Volume 3 Issue 4, July-2017
Efficient and Scalable Multiple Class Classification: A Review [76-79]
Author: Tarun Yadav
An Approach to Detect Malicious URL through Selective Classification[80-84]
Author: Ghanshyam Sen, Himanshu Yadav, Anurag Jain
Use of LC-Filters to Protect Equipment from Electromagnetic Pulse: Is it Real Necessity or “Business as Usual”? [85-89]
Author: Vladimir Gurevich
Pilot Tone based Wiener Filtering Approach for Carrier Channel Offset Estimation in OFDM Systems [90-95]
Author: Shweta Sharma, Minal Saxena
Node Replacement and Alternate Path based Energy Efficient Routing Protocol for MANET[96-101]
Author: Nehalastami, Hansa Acharya
IJSRET Volume 3 Issue 3, May-2017
Author: Neelam Kushwaha, Dharmendra Kumar Singh
Encrypted and Unencrypted Computation for Abstract Machine [59-62]
Author: Thripthi P. Balakrishnan, Mr. S. Vijayanand, Dr. T. Senthil Prakash
Software Defined Visibility using REST API [63-66]
Author: Sindhu T, Shilpa Biradar
Battery Power Aware LAR Protocol for Mobile Ad-Hoc Network [67-69]
Author: Tarun Yadav
LAR Protocol over Mobile Ad-Hoc Network: Survey [70-75]
Author: Jasmeen Akhter, Akhilesh Shukla
Resilient Hybrid Middleware Frameworks: Automating Tomcat, JBoss, And WebSphere Governance Across Unix/Linux Enterprise Infrastructures
Authors: Sambasiva Rao Madamanchi
Abstract: This article examines the orchestration of Tomcat, JBoss, and WebSphere across hybrid Unix/Linux infrastructures, emphasizing the role of automation in creating resilient middleware frameworks. It explores the challenges of heterogeneous environments, including fragmentation, manual administration, compliance demands, and security risks. The discussion highlights how governance can be embedded into middleware provisioning, patching, and monitoring through tools such as Puppet, Chef, Ansible, Nagios, Zabbix, and Tripwire. Case studies from finance, telecommunications, healthcare, and government demonstrate how automated middleware governance improves resilience, compliance, and efficiency in mission-critical operations. The article also addresses challenges such as configuration drift, cultural resistance, and scalability while offering best practices for mitigation. Looking ahead, it identifies future directions including AI-driven predictive monitoring, cloud-native middleware orchestration, continuous compliance, and sustainability integration.
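The configuration-drift problem the middleware article raises can be sketched as a simple baseline comparison. This is a hedged illustration, not Puppet/Chef/Ansible internals; the setting names and values below are hypothetical. The idea is that a governed baseline describes each middleware instance's intended state, and any live deviation is flagged for remediation:

```python
# Hypothetical drift check: governed baseline vs. live middleware settings.
def detect_drift(baseline: dict, live: dict) -> dict:
    """Return {setting: (expected, actual)} for every setting that drifted."""
    drift = {}
    for key, expected in baseline.items():
        actual = live.get(key)
        if actual != expected:
            drift[key] = (expected, actual)
    return drift


# Example governed baseline for a Tomcat-like instance (values are made up).
baseline = {"max_threads": 200, "tls_protocol": "TLSv1.3", "audit_log": "on"}
live = {"max_threads": 200, "tls_protocol": "TLSv1.2", "audit_log": "on"}
# detect_drift(baseline, live) flags tls_protocol for automated remediation.
```

Configuration-management tools run essentially this comparison continuously and at scale, which is why embedding governance into provisioning and patching, rather than auditing after the fact, keeps heterogeneous estates convergent.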
From Compliance To Cognition: Reimagining Enterprise Governance With AI-Augmented Linux And Solaris Frameworks
Authors: Sambasiva Rao Madamanchi
Abstract: This article examines the transformation of enterprise governance from a compliance-centric model to a cognition-driven framework enabled by artificial intelligence. It highlights the limitations of traditional governance approaches, which focus on static audits and regulatory adherence, and explores how AI can enhance oversight by providing real-time anomaly detection, predictive compliance, and dynamic policy enforcement. Linux and Solaris, long recognized for their robust security and auditing capabilities, are positioned as ideal platforms for building AI-augmented governance systems that integrate intelligence, automation, and transparency. The discussion presents an architectural blueprint that layers AI engines and automation workflows over existing governance features, ensuring adaptability across hybrid IT environments. Challenges such as AI bias, ethical considerations, and data privacy are addressed alongside practical applications in finance, healthcare, telecommunications, and the public sector. Beyond technical benefits, the article demonstrates the strategic value of cognitive governance in reducing compliance costs, enhancing resilience, and fostering innovation.
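The real-time anomaly detection described above can be illustrated with a deliberately minimal stand-in for an AI engine layered over Linux/Solaris audit logs. The metric (logins per minute), the sample values, and the z-score threshold are all assumptions for the sketch; production systems would use richer models over many audit signals:

```python
# Minimal anomaly flag on an audit-log event rate (assumed metric and threshold).
import statistics


def anomalous(history: list, latest: float, z_threshold: float = 3.0) -> bool:
    """Flag `latest` if it lies more than z_threshold standard deviations
    from the historical mean of the observed event rate."""
    mean = statistics.fmean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return latest != mean
    return abs(latest - mean) / stdev > z_threshold


# Historical logins-per-minute drawn from audit records (made-up values).
logins_per_min = [4, 5, 6, 5, 4, 6, 5, 5]
# A sudden burst of 40 logins/min stands far outside this baseline and
# would trigger a dynamic policy response rather than wait for an audit.
```

Even this crude statistical gate captures the shift the article argues for: oversight moves from periodic, static audits to continuous evaluation of every new observation against learned behavior.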