What are Data Cartels and how does BeefLedger solve for them?


Cartels have long been recognised as having deleterious impacts on consumers, ever since Stigler’s 1964 classic “A Theory of Oligopoly”. This has typically taken the form of collusion to control production volumes and manipulate pricing.

We have identified a specific category of cartel risk in our work on supply chain optimisation in an era of digitalisation: namely, data cartels. Here, we extend the foundational thinking on collusion over the pricing and production of goods and services to address the production, storage and dissemination of data itself (the data function).

How does BeefLedger solve for them? Project Kratos

To address this risk, BeefLedger has stood up Project Kratos, enlisting the community at large (the “buy-side”, so to speak) to participate in, and become a fundamental part of, whole-of-ecosystem integrity. In other words, the community can take part in attesting that the data on the network is credible and reliable.

Project Kratos addresses this set of interrelated common-knowledge and mutual-knowledge challenges by embedding the following design principles into the common data ecosystem, underpinned by decentralised consensus protocols:

  1. Data requirement priorities are user-determined.

  2. Consumers have an inalienable interest in the integrity of supply chains, and supply chain data.

  3. All data proposals require a multi-party governance structure and protocol to be valid. No-one can act alone.

  4. The data commons governance – as enabled by the underlying blockchain architecture – should tend towards empowering self-governance and organic adaptation, with decision-making capacity vested in network members.

Multisig & Schelling Points

At the heart of the BeefLedger data community is a 2-part set of processes by which data is proposed, validated and published to the blockchain:

  1. A multi-sig procedure, so that any data proposal requires a number of actors to share responsibility for proposing and witnessing the information. We apply an organic philosophy here, so that self-governing “thresholds of trust” can be determined over time by the community at large. In other words, no one size fits all (a minimal sketch of this follows the list); and

  2. A whole-of-community attestation procedure (optional), underpinned by the economics of “Schelling” points (see diagram below). Data validation is a common utility resource and, as a service, is something proposers and the community at large need to pay for. By rendering the value of data validation explicit, we create novel, transparent mechanisms that incentivise “truthful convergence” (a second sketch follows the list). Incidentally, these mechanisms also effectively valorise reputations, which over time build deep trust in actors that demonstrate informational virtue within the ecosystem.
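The multi-sig idea can be illustrated with a minimal sketch. The class and field names below are illustrative assumptions, not BeefLedger’s actual contracts or API; the key point is that the witness threshold is a community-governed parameter rather than a fixed constant, and that no proposal publishes on the strength of one actor alone.

```python
from dataclasses import dataclass, field


@dataclass
class DataProposal:
    proposer: str                      # member proposing the supply chain event
    payload_hash: str                  # hash of the data being asserted
    signatures: set = field(default_factory=set)


class MultiSigRegistry:
    def __init__(self, threshold: int):
        # The "threshold of trust": how many independent witnesses a proposal
        # needs. The community can raise or lower this over time.
        self.threshold = threshold

    def sign(self, proposal: DataProposal, witness: str) -> None:
        if witness == proposal.proposer:
            raise ValueError("a proposer cannot witness their own proposal")
        proposal.signatures.add(witness)

    def is_valid(self, proposal: DataProposal) -> bool:
        # No one can act alone: the proposal is only publishable once enough
        # distinct witnesses have co-signed it.
        return len(proposal.signatures) >= self.threshold


# Example: a community-determined 3-of-N threshold.
registry = MultiSigRegistry(threshold=3)
proposal = DataProposal(proposer="farm-01", payload_hash="0xabc123")
for witness in ("processor-07", "logistics-02", "exporter-05"):
    registry.sign(proposal, witness)
assert registry.is_valid(proposal)
```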
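The optional whole-of-community attestation step can be sketched in the same spirit. The payoff rule below is an assumption for illustration: attestors who converge on the majority (“Schelling point”) view split the proposal fee pool and gain reputation, while dissenters earn nothing. BeefLedger’s actual fee schedule and reputation weights are not specified here.

```python
from collections import Counter


def settle_attestations(votes: dict[str, bool], fee_pool: float,
                        reputation: dict[str, float]) -> dict[str, float]:
    """Pay the fee pool out to attestors who converged on the majority view."""
    tally = Counter(votes.values())
    majority_view = tally.most_common(1)[0][0]
    winners = [member for member, vote in votes.items() if vote == majority_view]

    payouts: dict[str, float] = {}
    for member in winners:
        payouts[member] = fee_pool / len(winners)                # share of the fee pool
        reputation[member] = reputation.get(member, 0.0) + 1.0   # reputations accrue
    return payouts


# Example: two attestors converge on the majority view, one dissents.
reputation: dict[str, float] = {}
payouts = settle_attestations(
    votes={"member-a": True, "member-b": True, "member-c": False},
    fee_pool=30.0,
    reputation=reputation,
)
# member-a and member-b each receive 15.0 and gain reputation; member-c receives nothing.
```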
