5 Dimensions of Digital Trust for Decision-Makers
by Nesirat Pub

The 5 Dimensions of Digital Trust

I am delighted to welcome you to Leadership by Design, a weekly newsletter in which I provide actionable insights that can help readers think differently about things they struggle with today. 

If you'd like to see more of my posts and articles, follow me on LinkedIn.




Every individual wants to know that the applications they use on a daily basis are safe to use. I have lost count of the number of times I have heard people express skepticism about a new or existing application. This is partly because it is our natural tendency as human beings to want our personal information to remain private, from medical records to login credentials, and partly because of the rampant sale of consumer data by the technology industry.

Even though the practice is not new, the rate and scale at which it is carried out are alarming. Users, however, are becoming increasingly aware of their privacy rights and are taking extra precautions to protect them. They are also better informed about where to voice their concerns in order to have the greatest impact.

A recent speech by Philippe Dufresne, Canada's Privacy Commissioner, emphasized the importance of data privacy in this regard: "Privacy is a fundamental right, a precondition for citizens' other freedoms as well as a keystone right for democracy and personal and social development...".

It is therefore critically important that developers and advocates of digital technologies recognize that valuing consumer privacy, responsible data use and inclusion may quickly become the de facto norm. Companies must take digital trust seriously if they are to survive in an increasingly competitive market.

So, let's look closely at how decision-makers can achieve this.

Last week's issue of Leadership by Design discussed three of the eight dimensions of digital trust. In today's issue, I will examine the other five dimensions: transparency, interoperability, auditability, redressability, and fairness.

To recap, digital trust is individuals' expectation that digital technologies and services – and the organizations providing them – will protect all stakeholders' interests and uphold societal expectations and values.

According to the World Economic Forum's report, Earning Digital Trust: Decision-Making for Trustworthy Technologies, trust in digital technologies and services (and in the organizations that provide them) has eroded rapidly, and with it the expectation that these technologies and services will protect the interests of all stakeholders and uphold society's expectations and values.

If the current decline in trust is to be reversed, the leaders of companies that design and deploy digital technologies must adopt a more trustworthy approach. Citizens and users must be treated as stakeholders in technological innovation, and technological harms should not be dismissed as externalities that matter only when they hurt the bottom line. In other words, leaders must choose to make decisions that promote digital trust.

Without CEO and board action and a change in behaviour, trust in technology and innovation will continue to fall.

1. Transparency

For digital operations and uses to be transparent, honesty and clarity are required. 

When organizations are able to provide visibility into their digital processes, they reduce the information asymmetry between them and their stakeholders. By doing so, they signal to individuals that the organization intends not only to act in their interest, but also to make the organization's actions known and understandable to all within and outside the organization.

How it relates to the digital trust goals

Inclusive, ethical and responsible use

Transparency allows us to observe how decisions are being made, enabling us to intervene in order to create an inclusive, ethical, and responsible environment.

In situations where organizations recognize that they have an ethical responsibility to share information about how technological systems are used and for what purposes, ensuring transparency is an important element of trust-building.

It is through transparency that we gain insight into how technologies are developed and implemented, how data is used, and how standards are set for governance. In addition, if the mechanisms of accountability and oversight are transparent, they become more trustworthy. Providing stakeholders with insight into how technology decisions are made and how issues related to the development of new technologies are handled also increases trustworthiness for customers, citizens, and other stakeholders.

For the goals of security and reliability to affect the trustworthiness of an organization or technology, their specifics, and the progress made towards achieving them, must be transparent. Even relatively simple mechanisms for tracking security incidents and reliability failures can significantly enhance trustworthiness. They help stakeholders set expectations and confirm that these goals are taken seriously by the organizations to which their data or physical safety is entrusted.

Implementation

The design should be user-friendly and transparent.

Leaders should encourage their teams to work backwards: determine what details may need to be disclosed in the future. When developing an organization's technology stack, document design decisions and build in capabilities that track the use and flow of data, so this information can be communicated in a timely and useful manner to a range of internal and external stakeholders, as well as surfaced in the products the organization provides to external users.
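As a concrete illustration of tracking the use and flow of data, here is a minimal sketch of a data-use ledger that records who used which dataset and for what purpose, and can produce a stakeholder-facing disclosure view. All class and field names are hypothetical, not part of any report or standard.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataUseEvent:
    """One record of data being used: who, on what dataset, for what purpose."""
    actor: str
    dataset: str
    purpose: str
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class DataUseLedger:
    """Append-only record of data use, queryable per dataset for disclosure."""

    def __init__(self):
        self._events: list[DataUseEvent] = []

    def record(self, actor: str, dataset: str, purpose: str) -> DataUseEvent:
        event = DataUseEvent(actor, dataset, purpose)
        self._events.append(event)
        return event

    def disclosures_for(self, dataset: str) -> list[dict]:
        """Stakeholder-facing view: the purposes a dataset has been used for."""
        return [
            {"purpose": e.purpose, "when": e.timestamp}
            for e in self._events
            if e.dataset == dataset
        ]
```

In practice such a ledger would live behind the organization's data-access layer, so every use is captured automatically rather than relying on teams to remember to log it.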

2. Interoperability

Interoperability is the ability of information systems to connect and exchange information for mutual use without undue burden or restriction.

How it relates to the digital trust goals

Inclusive, ethical and responsible use

An organization's ethical and responsible use goals must also be considered in order to achieve interoperability. In order to achieve this, it may be necessary to strike a balance between interoperability on a large scale and the organization's commitment to ethical and responsible use. Therefore, the degree of technology interoperability must be determined by senior leaders and should not be viewed as a purely technical issue.

Similarly, interoperability may promote inclusivity by facilitating access to technology for a wider range of stakeholders (for example, the portability of health data). Nevertheless, the benefits and risks of these interconnections must be weighed in light of the organization's goals and the expectations of its stakeholders.

Accountability and oversight

Interoperability facilitates collaboration and improvement of technology among many individuals and organizations. With such a large number of collaborators, there is an opportunity for additional oversight, but it also necessitates the development of additional accountability mechanisms within each organization. 

In cases where collaboration promotes and facilitates group problem-solving, this will involve inputs that should be considered by the accountability functions within the individual collaborating organizations. Every organization that develops interoperable technologies has the responsibility of ensuring that its accountability and oversight mechanisms meet the standards of the whole system and the expectations of all stakeholders.

A significant contribution to technology security and reliability is made by interoperability requirements and controls. For technology to co-exist and connect with other technologies and data, a degree of openness – including open-source code and common data standards – is necessary, even if not in itself sufficient, to enable sharing and integration. 

In addition, when source code is accessible and public, users can help verify that the technology functions as intended and identify how its safeguards depend on other technologies and organizations. Even when source code cannot be made public, adequate assurances of security and reliability can promote interoperability between systems, which both results from digital trust and promotes greater stakeholder trust.

Key considerations for decision-makers

Interoperability should not be regarded merely as a technical issue, but rather as a matter for senior leaders' judgment.

Implementation

Establishing the foundation for interoperability. 

For data to be interoperable, it must be interpretable and presentable by different systems while preserving its original context. Achieving this requires attention to the governance and operating rules for the technology. These rules establish how participants in the interoperability arrangement will make decisions, manage operations jointly, and evaluate risk. A business agreement should also balance the economic interests of the parties and provide incentives for source code and data exchange. Designers must likewise plan for the technical infrastructure that connects parties, systems, and the data they contain.
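One way to preserve original context across systems is to exchange data inside an envelope that carries its interpretive metadata (schema version, origin, units) alongside the payload. The sketch below illustrates the idea only; the envelope fields and version string are hypothetical, standing in for whatever standard the collaborating parties actually agree on.

```python
import json

SCHEMA_VERSION = "1.0"  # hypothetical version of a standard shared by all parties

def to_envelope(payload: dict, source_system: str, units: dict) -> str:
    """Serialize data together with the context (origin, schema, units)
    another system needs to interpret it correctly."""
    envelope = {
        "schema_version": SCHEMA_VERSION,
        "source_system": source_system,
        "units": units,  # e.g. {"weight": "kg"}, so meaning survives transfer
        "payload": payload,
    }
    return json.dumps(envelope)

def from_envelope(raw: str) -> dict:
    """Receiving side: refuse data whose context it cannot interpret,
    rather than silently misreading it."""
    envelope = json.loads(raw)
    if envelope.get("schema_version") != SCHEMA_VERSION:
        raise ValueError("unknown schema version; cannot interpret payload")
    return envelope["payload"]
```

The design choice worth noting is the explicit version check: a receiver that rejects unknown schemas fails loudly, which is safer for trust than one that guesses at the meaning of unfamiliar data.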

Technology standards and expectations must be uniform. 

Industry-specific standards can contribute to economic growth and social progress. History offers many examples: competing organizations' adoption of common standards enabled wider dissemination of critical goods, money and information, because users and companies were no longer dependent on specific systems and networks.

3. Auditability

How it relates to the digital trust goals

Inclusive, ethical and responsible use

It is possible for organizations to measure their own progress in relation to their ethical goals by conducting comprehensive audits. The availability of the results can also serve as an indication to individuals and other stakeholders that an organization is meeting its commitments to achieve this objective. Organizations should consider the implications of their technology decisions when considering how to audit them. Digital trust audits must assess whether technologies developed, implemented or used are adequately inclusive of a wide range of potential users and stakeholders (and meeting their expectations), as well as whether they meet the organization's ethical and societal commitments and goals.

Accountability and oversight

Governance, accountability, and oversight are enhanced by audits. Without a robust audit mechanism, an organization cannot adequately achieve this goal. In order for digital technologies (especially emerging technologies such as artificial intelligence) to be accountable and overseen effectively, auditability must be addressed at the conception stage. 

The auditability of a technology, data processing, and governance process refers to the ability of an organization and third parties to examine and verify its actions and results. 

An organization's auditability serves as a check on the commitments it has made and signals its intention to fulfill those commitments.

Organizations with a high level of trust avoid developing or implementing technologies that operate in a "black box", which makes it impossible to examine their operation and performance.

It is through auditability that we can adjust for the otherwise limited means of assessing security and reliability, an opportunity that occurs often only when a serious security or reliability issue arises. The publication of security audits and the operation of bug bounty programs can indicate to trust givers that the organization places a high degree of importance on security and reliability. Reporting security breaches and uptime externally, as well as measures taken to improve these factors, contributes to the building of trust among stakeholders in an organization.
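To make the idea of verifiable, non-"black box" records concrete, here is a minimal sketch of a tamper-evident audit log: each entry includes a hash of the previous entry, so an auditor can detect any after-the-fact edit. This is an illustrative pattern, not a prescribed mechanism from the report, and the class and field names are hypothetical.

```python
import hashlib
import json

class AuditLog:
    """Append-only log where each entry chains to the previous one by hash,
    so any retroactive modification is detectable on verification."""

    GENESIS = "0" * 64  # placeholder hash before the first entry

    def __init__(self):
        self.entries = []
        self._last_hash = self.GENESIS

    def append(self, record: dict) -> str:
        body = json.dumps({"prev": self._last_hash, "record": record},
                          sort_keys=True)
        entry_hash = hashlib.sha256(body.encode()).hexdigest()
        self.entries.append({"prev": self._last_hash,
                             "record": record,
                             "hash": entry_hash})
        self._last_hash = entry_hash
        return entry_hash

    def verify(self) -> bool:
        """Recompute the chain; False means some entry was altered."""
        prev = self.GENESIS
        for e in self.entries:
            body = json.dumps({"prev": prev, "record": e["record"]},
                              sort_keys=True)
            if e["prev"] != prev or \
               hashlib.sha256(body.encode()).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

A log like this can back both internal oversight functions and third-party audits, since verification requires no trust in whoever operates the log.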

Considerations for decision-makers

Implementation

Defining the scope of an organization’s audit landscape. 

Most organizations are well versed in auditing their quantitative procedures, decisions, and associated data. When seeking to earn digital trust, documenting and applying auditability standards and processes to qualitative procedures and decisions is all the more crucial, given their variability and the potential for claims that an organization is not meeting its commitments. Organizations must therefore compensate for the documentation challenges such procedures and decisions present.

4. Fairness

Those responsible for technology and data processing at an organization should be aware of the potential for disparate impact, as well as strive to achieve just and equitable outcomes for all stakeholders, given the relevant circumstances and expectations.

How it relates to the digital trust goals

Inclusive, ethical and responsible use

Fairness is fundamental to achieving the goals of inclusivity, ethical behavior, and responsible use. What constitutes fairness in a given situation is ultimately a subjective judgment, involving a balance between equity, equality, consistency, and many other considerations. For example, in some scenarios equality may not be just, and equity considerations may therefore call for additional steps to level the playing field for certain individuals or groups.

Decisions such as the distinction between equality and equity are prime examples of the need for standardization. Standardizing this process enhances fairness by ensuring that such decisions are consistent in both process and outcome – a key hallmark of fairness – and aligned to a common set of ethical, inclusive and responsible-use norms defined in best-practice frameworks.

Accountability and oversight

Providing a signal of trustworthiness to customers and individuals is a key objective of accountability and oversight activities. In accordance with their values and those of the society in which they operate, organizations should include fairness as an issue to which they hold themselves accountable. There may be different standards of fairness for the same organization in different geographic locations. To achieve this goal, organizations should integrate fairness into their decision-making processes so that questions of "what is fair" or "what is just" are not exogenous.

It is important for organizations to provide opportunities for internal and external validation of whether a decision is fair (as defined consistently within the organization).

An essential component of fairness commitments is the achievement of similar outcomes across similar situations for different individuals. Where fairness is considered “treating similarly situated individuals similarly”, the mechanisms for protecting data and ensuring its availability for use for beneficial purposes must be equally applied. Good security itself is an exercise in promoting fairness. As organizations are the controllers of individuals’ data and receive benefits from using such data, fairness demands that they reciprocate that value by making efforts to protect the data they have received.

Considerations for decision-makers

Implementation

Documenting fairness judgment calls.

Fairness decisions should generally produce reasonably consistent treatment of all individuals, regardless of whether jurisdiction-specific discrimination protections (e.g., fair lending or fair housing) apply. Such decisions may be addressed in an organization's diversity, inclusion, or accessibility initiatives, or through responsible AI efforts.

Documenting the justification for associated decisions within an organization's technology and data processing helps ensure that both the process and the outcome are fair. For example, the trade-off between standardization and personalization may carry fairness implications, as there is often a fine line between appropriate personalization and biased (i.e., discriminatory or exclusionary) experiences.

In making design decisions, it will often be helpful to record the assessment of fairness and equity, since fairness can mean different things to different people in different contexts. This documented process may require an impact assessment that identifies affected stakeholders, potential harms and benefits, and steps that can be taken to mitigate the effects. 
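One widely used heuristic that an impact assessment might record is the disparate impact ratio, with the "four-fifths rule" from US employment-selection guidance as a common review threshold. The sketch below is an illustration only, not a method from the report, and a ratio below 0.8 signals the need for closer review rather than a verdict of unfairness.

```python
def selection_rate(outcomes: list[bool]) -> float:
    """Fraction of a group receiving the favourable outcome (True)."""
    return sum(outcomes) / len(outcomes)

def disparate_impact_ratio(group_a: list[bool], group_b: list[bool]) -> float:
    """Ratio of the lower group's selection rate to the higher group's.

    Values below roughly 0.8 (the 'four-fifths rule' heuristic) are a
    common signal that outcomes warrant a closer fairness review."""
    rate_a = selection_rate(group_a)
    rate_b = selection_rate(group_b)
    low, high = min(rate_a, rate_b), max(rate_a, rate_b)
    return low / high if high else 1.0
```

A documented assessment would pair a number like this with the stakeholder groups compared, the harms and benefits considered, and the mitigation steps chosen, so the judgment call itself remains auditable later.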

5. Redressability

An individual, group, or entity may seek recourse if they have been negatively affected by technological processes, systems, or data uses. It is necessary for trustworthy organizations to have robust methods for redress when recourse is sought and mechanisms in place to make individuals whole when they have been harmed by unintentional errors or unexpected factors.

How it relates to the digital trust goals

Inclusive, ethical and responsible use

By providing avenues for redress and establishing processes and cultures to provide redress, trust is built through the maximization of agency. 

Furthermore, this demonstrates an organization's respect for its stakeholders' interests, needs, and expectations. Developing or implementing new technologies requires external opportunities for identifying problems and redressing harms. Responsibility can be achieved through internal accountability measures, but ethical and responsible organizations also create avenues for redress when technology they develop or control harms external stakeholders. These external avenues likewise serve as a check when an organization falls short of the goal of inclusive, ethical, and responsible use.

In order for any accountability or oversight program to be effective, redress mechanisms are essential. A trustworthy organization utilizes its oversight function to ensure it is accountable to itself and all stakeholders for technology-related decisions and the consequences of those decisions rather than focusing solely on improving internal delivery or maximising efficiency or profit. It is essential that organizations take a human-centered approach to making decisions and actively seek opportunities to remedy harms caused by their decisions.

Failures in security and reliability affect the organization and its network of partners, users and other stakeholders. A significant period of downtime or a data breach damages trustworthiness and often results in a loss of trust, a loss compounded when such events are accompanied by a lack of redress or a refusal to make affected partners, customers, and individuals whole. When security or reliability cannot be adequately achieved, these stakeholders need a clear, easy-to-use avenue for redress so that harm can be assessed and corrected appropriately.

Considerations for decision-makers

Implementation

Facilitating effective redressability.

Commitments to redressability may build on existing procedures. Organizations are likely to already have support functions for users, customers, or clients. These functions may be layered, starting with automated self-service for frequently asked questions (FAQs), followed by email, telephone or chat support, and then by an agent if necessary. These functions and tiered processes can be used to deliver redressability effectively and at minimal cost to the organization. Organizations should also strive to engender trust through transparency, self-service mechanisms such as FAQs, and products and services designed to promote trust.
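The tiered support flow described above can be sketched as a simple routing function that walks the tiers from cheapest to most involved and records the path taken, so the redress process itself stays auditable. The tier names and handlers are hypothetical placeholders for an organization's real support functions.

```python
# Each handler returns a resolution string, or None to escalate to the next tier.
def faq_bot(ticket: dict):
    """Tier 1: automated self-service for frequently asked questions."""
    answers = {"reset password": "See the self-service password reset page."}
    return answers.get(ticket["issue"])

def support_agent(ticket: dict):
    """Tier 2: a human agent handles anything self-service cannot."""
    return f"Agent assigned to: {ticket['issue']}"

TIERS = [("self-service", faq_bot), ("agent", support_agent)]

def route_for_redress(ticket: dict) -> dict:
    """Walk the tiers until one resolves the ticket; keep the path taken
    so the redress process can itself be reviewed later."""
    path = []
    for name, handler in TIERS:
        path.append(name)
        resolution = handler(ticket)
        if resolution is not None:
            return {"resolved_by": name, "resolution": resolution, "path": path}
    # Nothing resolved it: escalate beyond the standard tiers.
    return {"resolved_by": None,
            "resolution": "escalate to review board",
            "path": path}
```

Resolving most requests at the cheapest tier is what keeps redress affordable, while the recorded path gives oversight functions visibility into where escalation actually happens.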




Share Leadership by Design

Know someone who would benefit from weekly insights on business growth, building capacity, leadership and performance?

Refer them to Leadership by Design.

Please don't forget to like, repost and share your thoughts on the topic.

See you in the next issue.
