What types of cryptographic ledgers does Gossamer support?
As a cryptographic protocol, Gossamer is agnostic on this detail: it should be possible
to build Gossamer atop almost any cryptographic ledger that doesn't impose unusual
constraints that would prevent integration.
Gossamer, as currently implemented, supports Chronicle.
There's an open issue to support
Google's Trillian, if the community wants this.
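To make "ledger-agnostic" concrete, here is a minimal, purely illustrative sketch (in Python) of the kind of append-only interface any backend would need to expose; the names `LedgerBackend` and `MemoryLedger` are hypothetical and not part of Gossamer's actual API:

```python
# Hypothetical sketch only: the minimal surface a Gossamer-style tool
# needs from an append-only ledger. Chronicle, Trillian, or any other
# backend could sit behind an abstraction like this.
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from hashlib import sha256


class LedgerBackend(ABC):
    @abstractmethod
    def append(self, message: bytes) -> str:
        """Append a record and return an identifier (e.g. its hash)."""

    @abstractmethod
    def read_all(self) -> list[bytes]:
        """Return every record, in order, for replay and verification."""


@dataclass
class MemoryLedger(LedgerBackend):
    """Toy in-memory stand-in, useful only for exercising the abstraction."""
    records: list[bytes] = field(default_factory=list)

    def append(self, message: bytes) -> str:
        self.records.append(message)
        return sha256(message).hexdigest()

    def read_all(self) -> list[bytes]:
        return list(self.records)
```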
Why doesn't Gossamer just piggy-back on existing Certificate Transparency tools?
Mozilla has a project called Binary Transparency
that does exactly that, if you're interested.
We decided not to pursue this path for multiple technical reasons:
Protocol Rigidity and Complexity:
The existing Certificate Transparency systems are designed for augmenting the X.509 CA system.
Integrating with these systems requires hacks (e.g. Binary Transparency constructs a
domain name for each new binary record) and possibly an intermediate CA certificate.
Additionally, some of the other features (e.g. attestations) are much harder to implement in
Certificate Transparency than they would be in a separate ledger design.
Cryptographic Complexity: Certificate Transparency tethers
any new protocols to supporting RSA in practical perpetuity.
This has security implications beyond just inflating the attack surface:
There are almost no libraries that implement constant-time RSA;
instead, they rely on blinding techniques to prevent side-channel attacks.
(The story for constant-time ECDSA isn't much better, unless you're constrained
to NIST P-256—and even then, ECDSA's design makes it easier
to use insecurely.)
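By contrast, a ledger protocol designed from scratch can require a modern, constant-time signature scheme such as Ed25519 from day one. The sketch below is purely illustrative (it uses the PyNaCl bindings to libsodium and says nothing about Gossamer's exact implementation); it simply shows the sign/verify round-trip such a scheme provides without RSA's side-channel baggage:

```python
# Illustrative only: an Ed25519 sign/verify round-trip via PyNaCl
# (Python bindings to libsodium). This is the style of modern primitive
# a fresh ledger design can mandate outright, rather than carrying
# RSA forward in practical perpetuity.
from nacl.signing import SigningKey
from nacl.exceptions import BadSignatureError

signing_key = SigningKey.generate()      # stays with the publisher
verify_key = signing_key.verify_key      # distributed to everyone else

update = b"example-package-1.2.3.tar.gz contents"
signed = signing_key.sign(update)        # Ed25519 signature over the update

try:
    verify_key.verify(signed.message, signed.signature)
    print("signature valid")
except BadSignatureError:
    print("signature invalid; refuse to install")
```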
Does Gossamer compete with Authenticode?
It doesn't; they're totally independent of each other.
Authenticode is the code-signing feature built into Microsoft Windows; it's used to
verify the integrity of binaries on Windows devices. Windows accomplishes this through
a Certificate Authority-based trust chain and PKI (which has been successfully attacked
by nation-states before, due to legacy cryptography support).
Authenticode certificates cost hundreds of dollars per year.
Gossamer provides code integrity for open source software dependencies, and does so
without Certificate Authorities. The only way to solve this problem without Authorities
is to embrace extreme transparency. Gossamer signing keys
are effectively free (modulo the small energy cost to generate one and transmit bytes).
How would Gossamer help in a scenario like the 2020 SolarWinds supply-chain attacks?
After all, didn't SolarWinds already use code-signing?
Code signing is important for ensuring the software you're installing is authentic,
but it isn't the complete solution. Gossamer goes above and beyond just
digital signatures by including third-party attestations
in its design.
These third-party attestations can range from simply verifying that the same update
package (binary, zip file, etc.) can be rebuilt from the source code, to quick spot
checks for backdoors, to full-blown security audits (a.k.a. penetration tests).
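To make that concrete, here is one hypothetical shape such an attestation record could take. The field names and verdict labels below are invented for illustration and are not Gossamer's actual wire format:

```python
# Hypothetical attestation record; field names and verdict labels are
# illustrative, not Gossamer's wire format.
from dataclasses import dataclass
from enum import Enum


class Verdict(Enum):
    REPRODUCED = "reproduced"        # attestor rebuilt the same package from source
    SPOT_CHECK = "spot-check"        # quick review found no obvious backdoor
    SECURITY_AUDIT = "sec-audit"     # full security audit / penetration test
    VULNERABLE = "vulnerable"        # attestor found a problem; do not install


@dataclass(frozen=True)
class Attestation:
    attestor: str          # identity (e.g. public key) of the reviewer
    artifact_hash: str     # hash of the update package being vouched for
    verdict: Verdict
    signature: bytes       # attestor's signature over the fields above
```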
Now let's think through a SolarWinds-style attack, in which the build machines and signing
keys are compromised, so the deliverable carries a valid signature but contains malware:
With Gossamer, the update will be logged in the ledger before anyone can install it.
This creates an immutable footprint of the attack (and precisely when it took place).
So if an attacker gets this far, they have abandoned stealth.
Beyond that, the attacker would further need to know which third parties their target
uses to vet software updates, and then independently compromise all of those entities.
Without that knowledge, capability, and red-team bandwidth at the attacker's disposal,
the systems they're hoping to compromise will never install the published update
until the trusted third parties have approved it. Security companies and experts
that approve malware do so at the risk of their reputations.
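To illustrate that last point, here is a hypothetical local install policy: a client only installs an update once enough of its own chosen reviewers have vouched for the package's hash and none of them have flagged it. The function name, verdict strings, and threshold below are all assumptions made for this example, not part of Gossamer:

```python
# Hypothetical install gate: refuse an update until enough trusted
# reviewers have vouched for its hash, and none have flagged it.
def should_install(artifact_hash: str,
                   attestations: list[dict],
                   trusted_attestors: set[str],
                   required_approvals: int = 2) -> bool:
    relevant = [a for a in attestations
                if a["artifact_hash"] == artifact_hash
                and a["attestor"] in trusted_attestors]
    if any(a["verdict"] == "vulnerable" for a in relevant):
        return False                 # an explicit veto always blocks the install
    approvals = {a["attestor"] for a in relevant
                 if a["verdict"] in {"reproduced", "spot-check", "sec-audit"}}
    return len(approvals) >= required_approvals


# Two trusted reviewers vouch, nobody objects: the update clears the policy.
print(should_install(
    "abc123",
    [{"attestor": "reviewer-a", "artifact_hash": "abc123", "verdict": "reproduced"},
     {"attestor": "reviewer-b", "artifact_hash": "abc123", "verdict": "spot-check"}],
    trusted_attestors={"reviewer-a", "reviewer-b"},
))  # prints: True
```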
(Aside: Gossamer's design was intended for the open source community, but strictly speaking,
there's nothing that prevents security vendors from vetting and approving closed source
software through agreements with the software provider that furnish them with exclusive
access to the source code.)
Why put all this work into securing {PHP, WordPress} of all things?
A better question is: Why doesn't everyone else in the security industry do this too?
A lot of people ask this question because they hate PHP, or their infosec friends
say PHP is a bad language, or they believe that PHP security is an oxymoron, etc.
We'd like to offer a perspective many of them haven't considered:
No matter how strong your negative feelings about the language are, the PHP ecosystem's
code integrity problem could very easily become your availability problem.
(And if not yours, certainly your clients' or employers'.)
All it would take is someone abusing the software supply chain (e.g. WordPress's update servers)
to conscript a significant chunk of the webservers on the Internet into a DDoS botnet
army. It's difficult to calculate the total damage from such an attack, but it's safe to say
it would be a lot.
Shame is a poor mechanism for changing people's minds about the technologies they use.
If infosec managed to collectively boycott the PHP programming language today, it would
not stop most people from writing or deploying it. At the end of the day, our ethical
obligations are towards users first and foremost.
How can we trust a private, for-profit company with this project?
Aren't there perverse incentives at play?
Aside from us, there aren't many other companies willing to put in the work to secure entire
software ecosystems—especially the cryptographic engineering portions. It's difficult,
thankless work that doesn't generate any revenue, and there are so many other things we could
be doing with our time. Thus, we're really the only game in town (especially for PHP);
who else is there to trust?
Understandably, this makes some people nervous (bus factor, suddenly turning evil, etc.).
We encourage the skeptical to independently verify our claims (especially security claims)
and hold us accountable for the promises we make. We do everything (within reason) openly
and with extreme transparency, after all. Keep us honest! Beyond that, there's very little
we can say or do to assuage your concerns.
Regarding incentives: Our company's bread and butter is enabling our clients to secure the code
they own and depend on. It doesn't give us privileged access into anyone else's networks.
We engineered Gossamer to prevent abuse, even from ourselves.
The most benefit we'll ever be able to extract from the work we're doing (and have
been doing for years) is bragging rights for having done this work at all.