The Benefits of FMEA & QMS Software in your Organization

FMEA studies can require a huge effort. For complex processes or product designs with a multitude of systems and subsystems, you can easily end up with hundreds or thousands of failure modes.
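With that many entries, teams typically rank failure modes by a Risk Priority Number (RPN), the product of severity, occurrence and detection ratings, and act on anything above an agreed threshold. The sketch below is purely illustrative; the failure modes, ratings and threshold are invented, not taken from any particular product or tool:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    """One row of a hypothetical FMEA worksheet, rated on the common 1-10 scales."""
    name: str
    severity: int    # impact of the failure effect
    occurrence: int  # likelihood the cause occurs
    detection: int   # difficulty of detecting the failure before it escapes

    @property
    def rpn(self) -> int:
        # Risk Priority Number: the classic FMEA prioritization metric
        return self.severity * self.occurrence * self.detection

# Illustrative entries only; a real study may contain thousands of these
modes = [
    FailureMode("Seal leak", severity=8, occurrence=3, detection=4),
    FailureMode("Label misprint", severity=4, occurrence=6, detection=2),
    FailureMode("Sensor drift", severity=7, occurrence=5, detection=7),
]

RPN_THRESHOLD = 100  # hypothetical policy limit that triggers a risk-reduction action

for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    status = "ACTION REQUIRED" if fm.rpn >= RPN_THRESHOLD else "monitor"
    print(f"{fm.name}: RPN = {fm.rpn} ({status})")
```

Dedicated software automates exactly this kind of ranking, flagging and follow-up at the scale of a real study.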

Enhancing your Quality Management System (QMS) software with an FMEA application will enable you to store and manage risk-related information and easily relate risk reduction to quality events for maximum benefit. In addition, you can:

  • Streamline data collection, risk assessment and reporting across sites and facilities
  • Eliminate duplicate data, improve sharing and communication, and significantly reduce study time
  • Deploy active notifications based on built-in risk reduction policies and threshold limits
  • Reduce costs through improved risk management processes, including more effectively identifying and addressing failure modes
  • Provide real-time access to corporate quality performance metrics
  • Create control plans and control plan templates to ensure follow-through

In Intelex’s latest video, “Effectively Mitigate Risk with Intelex’s FMEA Software!” we show you … Read more...

Demonstrate Risk-Based Thinking to Auditors

Sooner or later, an auditor will come to evaluate your organization. While continuous improvement and compliance help, there are things you can do to make the auditing process go much more smoothly (this applies to internal audits as well). Preparation doesn’t start the week before; it starts the minute you implement a quality management system or prepare for ISO 9001 certification. You need a systematic approach to audit reporting that begins with core management principles, training, traceability, and credibility.

Because risk-based thinking is more prominent in ISO 9001:2015 than in previous versions of the standard, many organizations are wondering how to demonstrate it to auditors. Fortunately, most activities in the domain of quality management, if successful, serve to reduce risks. The key is to keep track of how your efforts relate to risk.
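One lightweight way to keep that record is a risk register that links each quality activity to the risk it addresses, with before-and-after ratings and a pointer to objective evidence. Here is a minimal sketch with invented field names and values (nothing here is prescribed by ISO 9001:2015):

```python
# A minimal, hypothetical risk-register entry linking a quality activity to the
# risk it addresses -- field names and ratings are illustrative, not mandated.
register = [
    {
        "risk": "Supplier ships out-of-spec raw material",
        "action": "Added incoming-inspection sampling plan (CAPA-2023-014)",
        "rating_before": 16,  # e.g., likelihood x impact before the action
        "rating_after": 6,    # rating after the action took effect
        "evidence": "Inspection records, Q3 supplier scorecard",
    },
]

for entry in register:
    reduction = entry["rating_before"] - entry["rating_after"]
    print(f'{entry["action"]}: risk reduced by {reduction} points ({entry["evidence"]})')
```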

Here are five actionable recommendations to demonstrate risk-based thinking to auditors: … Read more...

Building your Organizational Culture of Quality with a QMS

A strategic culture of quality, one in which every stakeholder shares the pride, passion, and initiative to deliver the highest quality products and services, should be built on a quality management system (QMS) software solution that integrates an organization’s people, processes, and tools. The world’s leading quality organizations, such as Toyota, know that implementing a QMS that creates an environment in which all employees can thrive, and which incorporates data-driven decision-making using statistical methods and continuous improvement, will be foundational to innovation and success.

Many industries face fundamental challenges in building a culture of quality. Construction projects often feature diverse teams of architects, designers, engineers, and builders who come together temporarily and bring with them their own unique perspectives on quality culture. In healthcare, rigid hierarchies can hinder effective communication and lead to more frequent instances of infection and negative patient outcomes. In hospitality, tight schedules and difficult physical conditions can … Read more...

How Risk-Based Thinking Can Have a Significant Impact on Brand Equity

By Nicole Radziwill & Sonduren Fanarredha

For an organization to deliver high-quality products and services consistently, it must be able to create and sustain long-term value. An organization’s brand therefore consists not only of its name, but also of its logo, its overall image and how it is perceived. “Brand equity” is the additional value a brand acquires because of its reputation or prestige in the marketplace. Brand equity takes time to build and, since it can have an impact on buying decisions over time, it is a significant part of an organization’s brand recognition and value. Losing this equity because of brand damage can also have far-reaching negative consequences.

As powerful as it can be, brand equity is also fragile. There are many forces that can threaten it, including:

  • Industry environments that are more uncertain and competitive.
  • Consumers who are increasingly empowered and have a stronger idea of what they … Read more...

What’s Your Quality 4.0 Strategy?

In ISO 9000:2015, quality is the “degree to which a set of inherent characteristics of an object fulfils requirements” (3.6.2). Quality 4.0 describes the technological innovations that will help us more quickly assess compliance and customer satisfaction and optimize business processes through systems integration — whether the object we’re working with is a process, a product, a person or an intelligent software system.

Quality 4.0 systems are:

  • Connected — electronic, networked, and capable of communicating in real time with people and systems.
  • Intelligent — autonomous, reactive, proactive, social and/or adaptive to new data or new environmental conditions.
  • Automated — able to carry out instructions with or without human participation.

As a result, Quality 4.0 strategies emphasize real-time visibility, intelligent decision support, and improved communication — between people, systems and machines.
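To make those three attributes concrete, here is a deliberately simplified sketch: a “connected” reading arrives as a machine message, an “automated” rule checks it against spec without human involvement, and a simple decision stands in for the intelligence a real Quality 4.0 platform would provide. The station name, measured value and spec limits are all invented:

```python
import json
from datetime import datetime, timezone

# Hypothetical spec limits for one measured characteristic (illustrative values only)
LOWER_SPEC, UPPER_SPEC = 9.8, 10.2

def handle_reading(payload: str) -> dict:
    """Check one networked measurement against spec and decide, without human
    intervention, whether a quality event should be raised."""
    reading = json.loads(payload)  # 'connected': data arrives as a machine message
    in_spec = LOWER_SPEC <= reading["value"] <= UPPER_SPEC
    return {
        "station": reading["station"],
        "value": reading["value"],
        "checked_at": datetime.now(timezone.utc).isoformat(),
        "action": "none" if in_spec else "open quality event",
    }

# Simulated message from a networked gauge
message = json.dumps({"station": "CMM-02", "value": 10.31})
print(handle_reading(message))
```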

For example, Nikon’s recently announced Quality 4.0 strategy focuses on real-time measurement: improving and automating measurement systems, automating inspections and … Read more...

Your Data is Your Most Valuable Asset: Getting Started with Quality 4.0

Data science and machine learning have surged in prominence over the past few years, and digital transformation seems to be on everyone’s agenda. Have you ever wondered why? Even though quality engineering has long been a data-driven pursuit, we now have the potential to get even deeper insights from our data because of several recent innovations:

  1. Computing power per dollar has increased steadily (e.g. through adoption of GPUs).
  2. Open-source software packages with powerful machine learning algorithms are freely available, reliable, robust, and well-maintained.
  3. Infrastructure for data storage and management is readily available and cost-effective.
  4. Cloud-based software, platforms, and infrastructure help companies focus on their core competencies and scale rapidly when needed.
  5. Many algorithms produce deeper, more reliable insights when Big Data is available.

It’s easy and cheap to collect data, but using it to generate actionable insights can be more challenging. Think, for example, about the digital displays available to a production … Read more...
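As a small, hypothetical illustration of that gap between collecting data and acting on it, the following sketch builds an individuals (I) control chart from invented in-line measurements, using only Python’s standard library, and flags the points worth investigating:

```python
from statistics import mean

# Hypothetical in-line measurements pulled from a production data feed
measurements = [10.02, 9.98, 10.01, 10.05, 9.97, 10.00, 10.31, 10.03, 9.99, 10.02]

# Individuals chart: estimate sigma from the average moving range (d2 = 1.128)
moving_ranges = [abs(b - a) for a, b in zip(measurements, measurements[1:])]
center = mean(measurements)
sigma_hat = mean(moving_ranges) / 1.128
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

# The actionable insight: which readings warrant investigation right now?
signals = [(i, x) for i, x in enumerate(measurements) if not (lcl <= x <= ucl)]
print(f"Center={center:.3f}  LCL={lcl:.3f}  UCL={ucl:.3f}")
print("Investigate points:", signals or "none")
```

A real Quality 4.0 platform goes much further, but the principle is the same: the value lies in the signal, not in the raw readings.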

The New Partnership-Based Landscape of QMS Validation

Many organizations across countless industries are turning to the Cloud Computing model to validate their Quality Management Systems. With this shift comes many changes, including a new emphasis on partnerships. Let’s look at this new partnership-based landscape.

Three’s company

With more than two parties (traditionally the customer and the software provider) now involved in the validation process (the cloud provider has now been added to the mix), the concept of the typical Service Level Agreement (SLA) changes in the cloud. It’s important, says Ray Glemser, CEO of IT solutions and services provider Glemser Technologies Corp., for a customer to recognize that they are now in partnership with the other two players, who are performing more specialized functions for them.

“These are tied together in what we used to call an SLA,” Glemser says. “We are now organizing them into quality agreements where the quality system validation state needs to be maintained … Read more...

Quality Management Tools for Enabling Customer Relationships

In February 2002, the United States Secretary of Defense, Donald Rumsfeld, uttered the following infamous phrase:

“There are known-knowns; there are things we know we know. We also know there are known-unknowns, that is to say we know there are some things we do not know. But there are also unknown-unknowns, the ones we don’t know we don’t know.”

Slovenian philosopher Slavoj Žižek’s clever rejoinder fills in the obvious missing element and demonstrates the secret wisdom of Rumsfeld’s analysis:

“…What he forgot to add was the crucial fourth term: the ‘unknown knowns,’ the things that we don’t know that we know.”

When it comes to knowing what customers want, we could learn a lot from Rumsfeld and Žižek. Sometimes customers know what they want and how to articulate it; sometimes they know what they want but not how to articulate it. Even more difficult to understand is when customers don’t … Read more...

How Software Provider Innovation Is Driving Change in QMS Validation

Important changes are taking place in the way many software providers develop their products, changes that have a direct impact on the validation process.

Waterfall development vs. Agile development

For many years, developers have used the “waterfall” model to create their software. This involves creating a fully developed version of the software before presenting it to customers or internal customer representatives for testing and feedback. Each step – plan, analyze, design, construct, test, deploy and maintain – is completed before the next begins, forming the cascading shape that gives the model its name.

Here, all user requirements are defined before anything is developed. A major drawback to the waterfall model is that it is often impossible to know all user requirements at the outset of a software development project. Those that are identified often change during the months-long development cycle. Others do not become evident to users until the completed version is put before them. Developers … Read more...

Changing Roles in the QMS Cloud Validation Model

Many organizations across countless industries are turning to the Cloud Computing model to validate their Quality Management Systems. With the shift comes a change in who performs what roles in the validation process. Let’s look at this new responsibility landscape.

Who plays what roles

Who does what in the validation model can change when we move to the cloud. The customer remains responsible for validation planning and user requirements on the front end, and for user acceptance testing and validation reporting on the back end.

The software provider can assume responsibility for system requirements, detailed design, system configuration and development, unit/integration testing and system testing. The cloud provider can now look after the technical architecture and installation qualification (IQ).

Change control – private cloud vs. public cloud

Life sciences companies can tap into a wealth of innovative software that is being developed at a rapid rate – but it comes at a … Read more...