doctype | section | topic | content |
---|---|---|---|
White Paper | Abstract | Achieving Complete Automation | As agile models become more prominent in software development, testing teams must shift from slow manual testing to automated validation models that accelerate the time to market. Currently, automation test suites are valid only for some releases, placing greater pressure on testing teams to revamp test suites so they can keep pace with frequent change requests. To address these challenges, artificial intelligence and machine learning (AI/ML) are emerging as viable alternatives to traditional automation test suites. This paper examines the existing challenges of traditional testing automation. It also discusses five use cases and solutions to explain how AI/ML can resolve these challenges while providing complete and intelligent automation with little or no human intervention, enabling testing teams to become truly agile. |
White Paper | Abstract | Anaplan Model Testing | Anaplan is a cloud-based platform that can create various business models to meet different organizational planning needs. However, the test strategy for Anaplan varies depending on the application platform, cross-track dependencies and the type of testing. This white paper examines the key best practices that will help organizations benefit from seamless planning through successful Anaplan testing. |
White Paper | Abstract | DAST Automation For Secure, Swift DevSecOps Cloud Releases | DevSecOps adoption in the cloud goes well beyond merely managing continuous integration and continuous deployment (CI/CD) cycles. Its primary focus is security automation. This white paper examines the barriers organizations face when they begin their DevSecOps journey, and beyond. It highlights one of the crucial stages of security testing known as Dynamic Application Security Testing (DAST). It explores the challenges and advantages of effectively integrating DAST into the CI/CD pipeline, on-premises and in the cloud. The paper delineates the best practices for DAST tool selection and chain set-up, which assist in shift-left testing and cloud security workflows that offer efficient security validation of deployments with risk-based prompt responses. |
White Paper | Abstract | Data Archival Testing | Today, there is an exponential rise in the amount of data being generated by organizations. This explosion of data increases IT infrastructure needs and has an immense impact on some important business decisions that are dependent on proficient data analytics. These challenges have made data archival extremely important from a data management perspective. Data archival testing is becoming increasingly important for businesses as it helps address these challenges, validate the accuracy and quality of archived data and improve the performance of related applications. The paper is aimed at helping readers better understand the space of data archival testing, its implementation and the associated benefits. |
White Paper | Abstract | Fighting Retail Shrinkage Through Intelligent Analysis and Validation | While striving to achieve greater profitability in a landscape characterized by aggressive competition, retailers keep racking up losses due to retail shrinkage. In this white paper we take a look at the various sources of retail shrinkage and discuss the benefits of establishing effective validation for loss prevention systems. The paper also reveals how loss prevention Quality Assurance (QA) solutions can reduce retail shrinkage. We also outline how the right testing approach and techniques can help avoid loss. |
White Paper | Abstract | Fragmented Test Data Management | Test Data Management (TDM) handles test data requests in an automated way to ensure a high degree of test coverage by providing the right data, in the right quantity, at the right time, in non-production environments. An automated TDM service facilitates test data management across test environments through a structured approach of data subsetting, cleansing, gold copy creation, data refresh, and sensitive data masking. Typically, a centralized TDM system with well-defined processes is more effective than the traditional manual or decentralized approach, but in some cases a decentralized approach is adopted. This paper takes a deeper dive into the considerations for centralizing TDM processes within enterprise IT. |
White Paper | Abstract | Microservices Testing Strategies | The ever-changing business needs of the industry necessitate that technologies adapt and align themselves to meet demands and, in the process of doing so, give rise to newer techniques and fundamental methods of architecture in software design. In the context of software design, the evolution of “microservices” is the result of such an activity and its impact percolates down to the teams working on building and testing software in the newer schemes of architecture. This white paper illustrates the challenges that the testing world has to deal with and the effective strategies that can be envisaged to overcome them while testing applications designed with a microservices architecture. The paper can serve as a guide to anyone who wants an insight into microservices and would like to know more about testing methodologies that can be developed and successfully applied while working within such a landscape. |
White Paper | Abstract | Modernizing Enterprise Systems | As digital technologies, smart wearables and remote monitoring capabilities penetrate healthcare, traditional healthcare companies are unable to keep up with end-user expectations. Under pressure to adopt rapid transformation, these organizations are looking for robust and end-to-end testing procedures. This paper explains various end-to-end testing approaches within the four main modernization techniques for healthcare companies. The analysis presented here acts as a guideline for healthcare leaders to make strategic and informed decisions on how to modernize their systems based on the needs of their end-users. |
White Paper | Abstract | Quantifying Customer Experience For Quality Assurance | Since the pandemic, the new normal has demanded increased digitalization across all industry sectors. Ensuring top-class customer experience has become crucial for all digital customer interactions across multiple channels such as web, mobile, and chatbots. Customer experience is an area in which neither the aesthetics nor the content can be compromised, as doing so will lead to severe negative business impact. This paper explains various automation strategies that can enable QA teams to provide a unified experience to end customers across multiple channels. The focus is to identify the key attributes of customer experience and suggest metrics that can be used to measure its effectiveness. |
White Paper | Abstract | Scaling Continuous Testing Across the Enterprise | Over the years, organizations have invested significantly in optimizing their testing processes to ensure continuous release of high-quality software. Today, this has become even more important owing to digital transformation. This paper examines some of the critical features of setting up a robust continuous testing practice across the enterprise. It considers the technology and process standpoints and provides guidelines to ensure a successful implementation from inception to delivery of high-quality software. |
White Paper | Conclusion | Achieving Complete Automation | Complete automation, i.e., an end-to-end test automation solution that requires minimal or no human intervention, is the goal of QA organizations. To do this, QA teams should stop viewing automation test suites as static entities and start considering these as dynamic ones with a constant influx of changes and design solutions. New technologies like AI/ML can help QA organizations adopt end-to-end automation models. For instance, AI can drive core testing whether this involves maintaining test scripts, creating automation test suites, optimizing test cases, or converting test cases to automated ones. AI can also help identify components for reusability and self-healing when required, thereby slashing cost, time and effort. As agile and DevOps become a mandate for software development, QA teams must move beyond manual testing and traditional automation strategies towards AI/ML-based testing in order to proactively improve software quality and support self-healing test automation. |
White Paper | Conclusion | Anaplan Model Testing | Anaplan's cloud-based, enterprise-wide and connected platform can help global organizations improve their planning processes across various business sectors. The Anaplan model is a simple, integrated solution that enables informed decision- making along with accelerated and effective planning. The strategic approach is one that leverages domain knowledge, test data management, automation, data validation, and integration testing, to name a few. The Infosys 7-step approach to effective Anaplan testing is based on our extensive experience in implementing Anaplan programs. It helps testing teams benefit from the best strategy for QA across various business functions, thereby ensuring smooth business operations. |
White Paper | Conclusion | DAST Automation For Secure, Swift DevSecOps Cloud Releases | DevOps is becoming a reality much faster than we anticipate. However, there should be no compromise on security testing, to avoid delayed deployments and the risk of releasing software with security vulnerabilities. Successful DevSecOps requires integrating security at every stage of DevOps, equipping DevOps teams with security skills, enhancing the partnership between DevOps teams and security SMEs, automating security testing to the extent possible, and shifting security left for early feedback. By leveraging the best practices recommended in this paper, organizations can achieve releases that are more secure and faster by as much as 15%, both on-premises and in the cloud. |
White Paper | Conclusion | Data Archival Testing | Given the critical business need for data retention, regulatory and compliance requirements, and a cost-effective way to access archived data, many businesses have started recognizing the value of data archival testing and adopting it. Therefore, an organization's comprehensive test strategy needs to include a data archival test strategy that facilitates smooth business operations, ensures fulfillment of all data requirements, maintains data quality, and reduces infrastructure costs. |
White Paper | Conclusion | Microservices Testing Strategies | Improvements in software architecture have led to fundamental changes in the way applications are designed and tested. Teams working on testing applications that are developed in the microservices architecture need to educate themselves on the behavior of such services, as well as stay informed of the latest tools and strategies that can help deal with the challenges they could potentially encounter. Furthermore, there should be a clear consensus on the test strategy and approach to testing. A consumer-driven contract approach is suggested because it better mitigates risk when services are exposed to an assorted and disparate set of consumers, and because it helps the provider deal with changes without impacting the consumer. Focusing the required amount of testing at the correct time, with the most suitable tools, ensures that organizations can handle testing in such an environment and meet the demands of the customer. |
White Paper | Conclusion | Modernizing Enterprise Systems | Disruptive technologies are creating avenues for healthcare providers to deliver virtual treatment, advice and services. However, this requires some degree of IT modernization for which end-to-end testing is crucial. There are various approaches that can be used to enable re-engineering, replacing, re-fronting, and re-platforming modernization techniques. Each testing approach has its benefits and risks and must be chosen based on the end-user expectations. Thus, it is important for business leaders to be aware of these in order to make the right decision for their IT modernization journey. The right approach can offer significant cost advantages, accelerate time-to-market and ensure seamless end-user experience. |
White Paper | Introduction | Achieving Complete Automation | Industry reports reveal that many enterprise initiatives aiming to completely automate quality assurance (QA) fail due to various reasons resulting in low motivation to adopt automation. It is, in fact, the challenges involved in automating QA that have prevented its evolution into a complete automation model. Despite the challenges, automation continues to be a popular initiative in today's digital world. Testing communities agree that a majority of validation processes are repetitive. While traditional automation typically checks whether things work as they are supposed to, the advent of new technologies like artificial intelligence and machine learning (AI/ML) can support the evolution of QA into a completely automated model that requires minimal or no human intervention. |
White Paper | Introduction | Data Archival Testing | One of the most important aspects of managing a business today is managing its data growth. For most organizations, the cost of data management now outpaces the cost of data storage. Operational analytics and business intelligence reporting usually require active operational data. Data that does not have any current requirement or usage, known as inactive data, can be archived to safe and secure storage. Data archiving becomes important for companies that want to manage their data growth without compromising on the quality of data that resides in their production systems. Many CIOs and CTOs are reworking their data retention policies and their data archival and data retrieval strategies because of increased demand for data storage, reduced application performance and the need to comply with ever-changing legislation and regulations. |
White Paper | Introduction | Microservices Testing Strategies | Microservices attempt to streamline the software architecture of an application by breaking it down into smaller units surrounding the business needs of the application. The expected benefits include systems that are more resilient, easily scalable, and flexible, and that can be quickly and independently developed by smaller individual teams. Formulating an effective testing strategy for such a system is a daunting task. A combination of testing methods, along with tools and frameworks that can provide support at every layer of testing, is key; as is a good knowledge of how to go about testing at each stage of the test life cycle. More often than not, the traditional methods of testing have proven to be ineffective in an agile world where changes are dynamic. The inclusion of independent micro-units that have to be thoroughly tested before their integration into the larger application only increases the complexity of testing. The risk of failure and the cost of correction after the integration of the services are immense. Hence, there is a compelling need to have a successful test strategy in place for testing applications designed with such an architecture. |
White Paper | Introduction | Modernizing Enterprise Systems | Sustainability in healthcare is a looming challenge, particularly as the fusion of disruptive innovations such as digitization, Internet-of-Things and smart wearables enable remote and real-time health tracking, diagnosis and management. To succeed in such an environment, healthcare organizations rely heavily on IT. Thus, using the latest end-to-end testing approaches becomes essential to: • Ensure that all applications operate as a single entity with multi-module interactions • Maintain performance/non-functional scenarios within the desired limit • Identify bottlenecks and dependencies ahead of time so that the business can take appropriate actions. |
White Paper | MainContent | Achieving Complete Automation | Pain points of complete automation: frequent requirement changes hinder complete automation, as manual updates to automation test suites become increasingly complicated over time; traditional automation often involves frequent manual script-writing, limiting its effectiveness; reusable assets within test suites are difficult to identify and utilize; and software development engineers with the right mix of technical skills and a QA mindset are scarce. How can enterprises achieve complete automation in testing? The key lies in leveraging AI/ML as an automation lever instead of relegating it to scripting. Optimizing manual test suites using clustering approaches reduces effort and duplication (a minimal clustering sketch follows this table). Natural language processing (NLP) transforms manual test cases into ready-to-execute automation scripts, identifying reusable components. Intelligent DevOps combines analytics and automation to enhance the automation process. Deep learning algorithms identify critical paths in the application, improving test coverage. Self-healing solutions automate the maintenance of the automation test suite, adapting to changes in the application. These use cases and solutions provide practical steps for organizations to enhance their test suite automation process, regardless of their test maturity level. |
White Paper | MainContent | Anaplan Model Testing | What is Anaplan? Anaplan is a cloud-based operational planning and business performance platform that allows organizations to analyze, model, plan, forecast, and report on business performance. Once an enterprise customer uploads data into the Anaplan cloud, business users can instantly organize and analyze disparate sets of enterprise data across different business areas such as finance, human resources, sales, and forecasting. The Anaplan platform provides users with familiar Excel-style functionality that they can use to make data-driven decisions, which otherwise would require a data expert. Anaplan also includes modules for workforce planning, quota planning, commission calculation, project planning, demand planning, budgeting, forecasting, financial consolidation, and profitability modeling. [Figure: an Anaplan Data Hub shares data sets from various sources across sales forecasting, executive compensation planning, sales target planning, budgeting, and region-wise sales planning, serving sales ops executives, regional goal planners, finance managers, and regional sales managers.] 7 best practices for efficient Anaplan testing: 1. Understand the domain: QA personnel must adopt the perspective of end-users to certify the quality of the Anaplan model. 2. Track Anaplan data entry points: QA teams should understand the various upstream systems from which data gets extracted, transformed, and loaded into the Anaplan models. 3. Ensure quality test data management: verify the accuracy of data being ingested by the models and dedicate a complete sprint/cycle to testing its accuracy. 4. Validate the cornerstones: optimize test quality to cover cornerstone business scenarios that may have been missed earlier. 5. Use automation tools for faster validation: explore options for automating Anaplan model testing to minimize delays during sprint testing. 6. Ensure thorough data validation: verify the accuracy and integrity of data between the database and the data hub, and when loading data from the hub into the model. 7. Evolve the test strategy with agile methodologies: incorporate automation in the test strategy to keep pace with shorter cycle times without compromising test quality. Anaplan provides a scalable and flexible platform for enterprise-level planning and decision-making. Testing Anaplan models requires understanding the domain and data entry points, and implementing effective test data management and validation practices. By following the best practices above, organizations can optimize their Anaplan testing efforts and ensure accurate and reliable business performance analysis and planning. |
White Paper | MainContent | DAST Automation For Secure, Swift DevSecOps Cloud Releases | Background: Traditional security practices are incongruous with the speed of DevOps, leading to a security and DevOps imbalance. The industry's shift to cloud transformation further raises concerns about security. The global average cost of a data breach is estimated at $4.35 million, emphasizing the importance of security in cloud migration planning. DevSecOps, the integration of security into DevOps, is crucial for securely accelerating application releases. However, organizations face challenges, including technical complexity, talent shortage, and lack of automation. Resolving these challenges requires a strategic approach that focuses on people, processes, and technology. Solution: Implementing DevSecOps requires integrating different types of security testing at each stage of the software development life cycle (SDLC). This ensures security is prioritized and automated throughout the CI/CD pipeline. Dynamic Application Security Testing (DAST) plays a crucial role in identifying vulnerabilities during application runtime. Integrating DAST into the CI/CD pipeline enables quick and effective security testing. Best practices for integrating DAST include provisioning essential compute resources, leveraging spidering techniques, creating separate jobs for different tests, and onboarding SMEs with security expertise. Tool Selection: Selecting the right security testing tools is crucial for effective DevSecOps implementation. Common challenges include lack of standards, tool complexity, inadequate training, and configuration challenges. Best practices for tool selection include considering tool standards, documentation, performance, cloud technology support, customization capabilities, and continuous vulnerability assessment. Implementing tools effectively involves customizing rules, incremental scans, leveraging AI capabilities, aiming for zero-touch automation, and enforcing security standards through automated gating. Integration: Integrating DAST in the CI/CD pipeline requires setting up a pre-production or staging environment and deploying the web application. Key considerations for integrating DAST using tools like ZAP include testing on the developer machine, setting up the CI/CD server and GitLab, reusing functional automation scripts, and customizing scripts for security testing (a minimal ZAP-driven pipeline sketch follows this table). The integration process ensures that DAST runs smoothly in the pipeline, providing early feedback on application security. |
White Paper | MainContent | Data Archival Testing | Data Archival Testing - Test Planning: Data archival is the process of moving non-operational, non-analytical, or non-reporting data to offline storage. Data archival testing helps address challenges such as inaccurate or irrelevant data and difficulties in the retrieval process. When devising a data archival test plan, factors like data dependencies, data encoding, and data retrieval should be considered. Data archival testing ensures the integrity of archived data, validates storage mechanisms, verifies hardware independence, tests data deletion processes, and ensures easy identification and retrieval of archived data (a minimal integrity-check sketch follows this table). The benefits of data archival testing include reducing IT infrastructure costs, ensuring compliance, optimizing performance, and facilitating efficient data retrieval. |
White Paper | MainContent | Microservices Testing Strategies | Microservices architecture presents several challenges in testing. The definition of microservices can vary, but they are characterized by being self-contained, decentralized, resilient, and built around a single business need. Testing microservices poses difficulties due to their distributed and independent nature. Testing teams may rely on Web API testing tools built for SOA, but this can be ineffective since the timely availability of all services for testing is not guaranteed. Additionally, individual services are expected to be independent, making data verification and extraction of logs complex. A good test strategy should consider the right amount of testing required at each point in the test life cycle and leverage unit testing, contract testing, integration testing, and end-to-end testing approaches (a minimal consumer-driven contract check is sketched after this table). |
White Paper | MainContent | Modernizing Enterprise Systems | Testing challenges in healthcare modernization: Business transformation in healthcare is a complex endeavor, primarily due to the challenges in maintaining the integrity between different types of customer needs and health-related plans. Modernizing healthcare software applications mandates enabling a multi-directional flow of information across multiple systems, which can complicate the entire healthcare workflow. Furthermore, failures or errors in systems outside the enterprise environment can adversely affect the performance of applications with which they are integrated. To address such challenges, it is crucial to determine the right methods and types of end-to-end testing. This will optimize application performance by thoroughly testing it across all layers, from the front-end to the back-end, along with its interfaces and endpoints. Typically, most healthcare organizations employ multi-tier structures with multiple end-users, making end-to-end testing a very complex process. Launching a new product or implementing changes in such a multi-directional business scenario necessitates extensive user testing. Therefore, to enable effective end-to-end (E2E) testing, health insurance companies must first understand what customers expect from their healthcare providers and identify how they can meet these expectations within shorter timelines. In order to achieve successful healthcare modernization through end-to-end testing, Infosys leverages four modernization techniques: re-engineering, replacing or retiring, re-fronting, and re-platforming. Each technique requires specific testing approaches tailored to the unique requirements of the modernization process. Re-engineering technique: This approach is best suited to cases where companies need to digitize healthcare product marketing to different constituents of a state through online retail. It involves simulating user-centric interactions, conducting compliance testing for security protocols and financial boundaries, and implementing blend testing to combine functional and structural testing into a single approach. Replacing or retiring technique: This technique is employed when there is a need to move contract and legal documentation from healthcare insurance and hospitals to a separate online portal. It focuses on plug-and-play testing, web service-based testing, neutrality testing, parallel testing, assembly testing, and usability testing. Re-fronting technique: This approach is utilized when adding an encryption logic protocol is required for sensitive claim-related information passing through a web service. It encompasses virtualization testing to simulate multiple users, non-functional testing to ensure platform integrity, and regression testing to verify system behavior. Re-platforming technique: This technique is implemented when upgrading billing/payments databases to recent versions is needed due to license renewals. It involves migration testing to ensure data integrity during technology upgrades, acceptance testing to maintain application recognition intensities, intrusive testing to analyze system behavior under unexpected variables, and volume testing to evaluate application stability under heavy load. A comparative analysis of the various testing approaches is crucial to determine the most suitable method for each modernization technique. |
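
The MainContent entry on achieving complete automation mentions clustering manual test cases to reduce effort and duplication. The sketch below illustrates that idea, assuming scikit-learn is available; the sample test cases and the cluster count are illustrative and not taken from the paper.

```python
# Minimal sketch: cluster similar manual test cases to spot duplication.
# Assumes scikit-learn is installed; the sample test cases and the number
# of clusters are illustrative, not taken from the white paper.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

test_cases = [
    "Login with valid credentials and verify the dashboard loads",
    "Login with a valid username and password, check the dashboard is shown",
    "Attempt login with an invalid password and expect an error message",
    "Add an item to the cart and verify the cart count increases",
    "Add a product to the basket and confirm the cart badge updates",
]

# Represent each test case as a TF-IDF vector so textual similarity is measurable.
vectors = TfidfVectorizer(stop_words="english").fit_transform(test_cases)

# Group similar cases; near-duplicates tend to land in the same cluster.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(vectors)

for cluster_id in range(3):
    print(f"Cluster {cluster_id}:")
    for case, label in zip(test_cases, kmeans.labels_):
        if label == cluster_id:
            print(f"  - {case}")
```

Near-duplicate cases typically land in the same cluster, giving reviewers a shortlist of candidates to merge before automation scripts are generated.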
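
The DAST MainContent entry describes integrating a scanner such as ZAP into the CI/CD pipeline and gating releases on the findings. The sketch below shows one way a pipeline step could do this, assuming a ZAP daemon is already listening on localhost:8080 and the zapv2 Python client (python-owasp-zap-v2.4) is installed; the target URL, API key, and gating rule are placeholders rather than the paper's configuration.

```python
# Minimal sketch: drive a ZAP spider + active scan from a pipeline step and
# fail the build on high-risk findings. Assumes a ZAP daemon is already
# running on localhost:8080 and the zapv2 client is installed; the target
# URL, API key, and gating threshold below are placeholders.
import sys
import time

from zapv2 import ZAPv2

TARGET = "https://staging.example.com"  # hypothetical staging deployment
zap = ZAPv2(
    apikey="changeme",
    proxies={"http": "http://localhost:8080", "https": "http://localhost:8080"},
)

# Spider the target so ZAP discovers URLs before the active scan.
spider_id = zap.spider.scan(TARGET)
while int(zap.spider.status(spider_id)) < 100:
    time.sleep(5)

# Run the active scan against everything the spider found.
scan_id = zap.ascan.scan(TARGET)
while int(zap.ascan.status(scan_id)) < 100:
    time.sleep(10)

# Gate the pipeline: any high-risk alert fails the job.
high_risk = [a for a in zap.core.alerts(baseurl=TARGET) if a.get("risk") == "High"]
for alert in high_risk:
    print(f"HIGH: {alert.get('alert')} at {alert.get('url')}")
sys.exit(1 if high_risk else 0)
```

Wiring such a script into a CI job after deployment to staging provides the early, automated security feedback the paper associates with shift-left DAST.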
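
The data archival MainContent entry lists the integrity of archived data as a core check. A minimal count-and-checksum comparison between a source and an archive store is sketched below, using in-memory sqlite3 purely as a stand-in; the table, columns, and cutoff date are illustrative.

```python
# Minimal sketch: verify that rows archived from a source store match the
# archive copy by row count and checksum. Uses in-memory sqlite3 as a
# stand-in for real source/archive systems; the table, columns, and the
# cutoff date are illustrative only.
import hashlib
import sqlite3

def checksum(rows):
    """Order-independent checksum over a set of rows."""
    digest = hashlib.sha256()
    for row in sorted(rows):
        digest.update(repr(row).encode("utf-8"))
    return digest.hexdigest()

source = sqlite3.connect(":memory:")
archive = sqlite3.connect(":memory:")
for db in (source, archive):
    db.execute("CREATE TABLE orders (id INTEGER, created TEXT, amount REAL)")

# Seed the source with active and inactive (pre-2020) records.
source.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "2019-03-01", 10.0), (2, "2019-07-15", 25.5), (3, "2024-01-09", 40.0)],
)

# Simulate the archival job: copy inactive rows to the archive store.
inactive = source.execute(
    "SELECT id, created, amount FROM orders WHERE created < '2020-01-01'"
).fetchall()
archive.executemany("INSERT INTO orders VALUES (?, ?, ?)", inactive)

# Archival test: counts and checksums of the archived slice must match.
archived = archive.execute("SELECT id, created, amount FROM orders").fetchall()
assert len(archived) == len(inactive), "row count mismatch after archival"
assert checksum(archived) == checksum(inactive), "checksum mismatch after archival"
print(f"Archived {len(archived)} rows verified by count and checksum")
```

The same count-plus-checksum pattern extends naturally to retrieval checks and to verifying that archived rows were actually purged from the source.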
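
The microservices MainContent entry (and the corresponding conclusion) recommends consumer-driven contract testing. The framework-free sketch below conveys the idea by validating a provider response against a consumer-declared contract; in practice teams usually adopt a dedicated contract-testing tool, and the fields and payload here are made up for illustration.

```python
# Minimal sketch: a consumer-driven contract expressed as required fields and
# types, checked against a provider response. Framework-free and illustrative;
# the payload below is a made-up example, not any specific service's API.
CONSUMER_CONTRACT = {
    "order_id": int,
    "status": str,
    "items": list,
}

def verify_contract(response: dict, contract: dict) -> list:
    """Return a list of contract violations; an empty list means the provider conforms."""
    violations = []
    for field, expected_type in contract.items():
        if field not in response:
            violations.append(f"missing field: {field}")
        elif not isinstance(response[field], expected_type):
            violations.append(
                f"field {field!r} expected {expected_type.__name__}, "
                f"got {type(response[field]).__name__}"
            )
    return violations

# Simulated provider response; in a real pipeline this would come from the
# provider's verification build, not a hard-coded dict.
provider_response = {"order_id": 42, "status": "SHIPPED", "items": ["sku-1"]}

problems = verify_contract(provider_response, CONSUMER_CONTRACT)
print("contract satisfied" if not problems else "\n".join(problems))
```

Publishing such contracts from each consumer lets the provider verify changes against every consumer's expectations before release, which is the risk-mitigation benefit the paper attributes to the consumer-driven approach.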