Fractureiser Mitigation Meeting 2023-06-08

Agenda and minutes for the 2023-06-08 meeting on follow-ups and preventions

In the interest of keeping the meeting productive, we invited a narrow set of members from the community, mostly people working on mod repositories, and people who helped organize the incident response.

Recordings

The meeting has been recorded and edited to include speaking identifiers. You can watch the recording on YouTube or PeerTube.

Transcript

A copy of the meeting transcript may be found here.

Time

2023-06-08 16:00 UTC

Attendees

Meeting shepherds/drivers:

  • Emi (initial responder, organizer)
  • Jasmine (organizer)

Meeting secretary/minutes:

  • williewillus (incident journalist; Violet Moon)

Members of the Community (Alphabetic):

  • Adrian (Modrinth)
  • cpw (Forge)
  • Doctor Oyster (Overwolf Community Team Leader)
  • Emma (Modrinth)
  • Fury (Overwolf CEO)
  • gdude (Quilt)
  • Geometrically (Modrinth)
  • IMS (CaffeineMC)
  • Jared (BlameJared Maven)
  • kennytv (PaperMC)
  • Mikey (FTB)
  • modmuss (Fabric)
  • Slowpoke (FTB)
  • Starchild (Quilt)
  • timoreo (Prism Launcher)
  • ZekeZ (Prism Launcher)

Summary of Incident and Response

(this is mostly for the benefit of attendees to get them caught up quickly - this doesn't need to be read aloud or something)

fractureiser is a novel self-replicating virus that infects Bukkit plugins, Forge mods, Fabric mods, and vanilla Minecraft jars. Infected jars, upon being loaded, run as normal but silently download a series of payloads that steal login tokens, stored browser passwords/payment information, and cryptocurrency. After a computer has been infected, every applicable jar file on the compromised system is infected in turn, so that if those jars are shared and run on another computer, the infection spreads. Compromised CurseForge login tokens were used to gain access to large mod projects and distribute infected jars to users.

Discussion and responses to the issue began in earnest on June 6th. Samples were gradually discovered, identified, and decompiled. The source of the payloads that propagate the malware and steal passwords/tokens was identified and swiftly taken down by its host, Serverion. Infected jars are no longer able to progress the infection or propagate the malware, but infections from before the node was taken down may still be active.

At time of writing, samples continue to be reverse engineered in the hopes that, should the attackers attempt to create a new iteration of the malware, its command and control nodes can be taken down as quickly as possible. On June 7th, the attacker attempted to create a new node, which was again swiftly taken down by its host. A web URL pointing to this now-defunct node has been found, and is being actively monitored.

Agenda

We have about an hour, so let's spend 15 minutes on each topic at most. I've organized the topics by my (williewillus') sense of how likely something actionable is to happen: the less realistic something is, the lower down it is. If time constrains us, we may drop those topics.

Opaque Review Processes/Security by Obscurity

Discussion and Action Items

  • What does CurseForge/Modrinth do when reviewing a mod?
    • Insert "what do you do here" meme :)
  • What automated checks are being run?
    • ?
  • What automated checks should be put in place?
    • Static analysis?
    • Can we get mod repos to commit to these action plans?
  • Semi-OT: Can we get a checksum algorithm that is not MD5 on CF, please? MD5 has been known to be broken for years, and this made us wary of attempted editing/collision attacks for longer than necessary
    • What does Modrinth use?

Minutes:

Fury: CEO of Overwolf. Engineering is working on mitigations, so I'm here.
Re: checks, from a high-level perspective. Don't want to be too open about the checks
because people can find ways to bypass them. Understand that the ambiguity is a concern;
will share as much as I can.
All mods have auto checks. Manual checks for some.
Technically, start by running antivirus/antimalware scans. Check file type/adherence to file structure.
API actually supports SHA-1.
More clientside improvements to enforce this in a good way.
Manual checks: when an auto check flags it, a human reviews the mod. Changing text/images also involves manual review.
Continuously evolving process; have updated automated tests to look for this vuln.

Emi: Moving to Modrinth to keep things quick

Geo/Jay: Founder of Modrinth. Our moderation is pretty robust. New projects get manual review: we check the description, images, and also the source code, especially for new/unknown authors. For closed-source mods, we will occasionally decompile them, but not often. There are checks on the backend for file structure. Malware in modded MC is generally never detected by antivirus. Have rejected mods in the past for suspicious code. Understand this process is flawed; after initial review, we rely on reports. Our code is open source; we want a transparent tool the community can use. Relies on community trust (known author), but this is also flawed.

Emi: When you check open source mods, do you verify the source corresponds to the jar?

Geo: No, but we could do something where users can upload the source for a verified badge, like what npm recently added. Might not be the most ideal.

Emi: Which hash is used by Modrinth?

Geo: We use SHA-1/SHA-512. Available in the API. The launcher we're building validates them. Rehashed all files in SHA-512 a year ago.

Reproducible Builds

One thing that would have helped in the "scanning for tampered mods" part of the response was knowing that a specific mod version has only one possible binary output; that is, reproducible builds given a source checkout. We could have simply rebuilt the mod in question from source, taken its hash, and compared it to the file under scrutiny to detect stage 0 infection, rather than going through the complicated exercise of trying to scan for signatures in the code.

How many of our mods have build scripts, plugins, or deps with unpinned, floating -SNAPSHOT version specifiers in them?

I'm willing to bet all of them, because both Fabric and Forge have example mods that do just that.

Gradle plugins like Loom and FG traditionally did this so that they could push fixes and iterate quickly without user action, but it's important now to have auditability.

Supply chain attacks from random modding Mavens being hacked is a very real threat.

Discussion and Action Items

  • Stop promoting usage of -SNAPSHOT in example templates and documentation in favor of fixed version numbers
  • Update fabric-example-mod and the Forge MDK example to use Gradle checksum verification and/or dependency locking? (a minimal sketch follows this list)
  • Stand up a working group in Loom/FG to investigate reproducible builds when the relevant Gradle flags are set. There may be nondeterminism introduced by parts of the modding toolchain, such as jar remappers. Each instance needs to be rooted out such that a mod builds reproducibly out of the box.
  • Crazy idea: F-Droid style, submit source and it gets built by the mod repo?
    • Expensive capacity wise, unlikely to happen.
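
As a concrete illustration of the pinning and locking items above, here is a minimal build.gradle.kts sketch; the plugin name, dependency coordinates, and version numbers are placeholders, not recommendations:

```kotlin
// build.gradle.kts -- a minimal sketch; coordinates and versions below are placeholders
plugins {
    java
    // A modding plugin such as Loom or ForgeGradle would be pinned the same way,
    // e.g. id("fabric-loom") version "1.2.7" rather than a floating "1.2-SNAPSHOT"
}

repositories {
    mavenCentral()
}

dependencies {
    // Pin libraries to exact versions instead of -SNAPSHOT specifiers
    implementation("com.example:some-library:2.3.4") // hypothetical coordinate
}

// Gradle's built-in dependency locking records resolved versions in lockfiles,
// so the build fails if a dependency silently changes underneath it
dependencyLocking {
    lockAllConfigurations()
}
```

Generating the lockfiles is a one-time `./gradlew dependencies --write-locks`, and `./gradlew --write-verification-metadata sha256 help` produces checksum metadata that Gradle verifies on later builds; both are stock Gradle features rather than modding-specific tooling.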

Starchild: For Quilt, one issue here is signing. We can't reproduce a signature.

willie: But if the signature is distributed separately, should be fine?

Geo: We can never be 100% secure. We can spend a lot of time reviewing every file. One important thing is accessibility. Many people don't know how Gradle works/what reproducible builds are. Template mods exist, but even code signing is hard for mobile devs. We get thousands of submissions a day.

Fury: One thing we suggest: authors who opt to go through extra verification could get a tag on their mod for gamers to see, to increase trust.

Mods Downloading External Files

Mods that download external files which contain executable code are vulnerable to a supply chain attack.

Some existing examples of mods that do this are:

  • Essential
    • Automatic updater downloads any updates without prompting the user. If the Essential update servers are compromised, the malicious code will be downloaded upon launching Minecraft.
  • Mods depending on owolib, such as gadget
    • Will prompt the user to download owo-lib, which will download the jar from Modrinth. If a malicious jar is added to Modrinth with the correct version, it can perform a supply chain attack.

Should one of the more popular mods, such as Essential, be compromised, it would allow malware to quickly propagate to millions of users.

Discussion and Action Items

  • Should mods uploaded to platforms (Modrinth, CurseForge, etc.) be permitted to download files containing executable code from external sources?
    • I think no, but how would we enforce this? Start the game with the mod and check it doesn't download anything new into the mods folder?

Fury: Manually review when we see this pattern happen

willie: But you can push malicious code later

Fury: look at the author's reputation and other attributes of safety.

Starchild: Not too keen on banning all downloading from the internet. See some legitimate use cases. We can mark the mod on the platform as downloading external code. Can have policies like requiring HTTPS, etc. We can draft it in the community.

willie: Clarification: I'm talking about downloading code, not data.

Lambda: TODO(willie i missed first half) Quilt had a proposal to do dependency downloading. More generally, only allow downloading of things that were predeclared (e.g. with a hash). Even if we allow mods that do this, I will still be suspicious of them.
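
As a rough sketch of the predeclared-hash idea in Kotlin; the helper and file names here are hypothetical, not any loader's or platform's actual API:

```kotlin
import java.io.File
import java.security.MessageDigest

// Hypothetical helper: refuse to load a downloaded jar unless it matches a SHA-256
// hash that the requesting mod declared ahead of time (e.g. in its metadata).
fun verifyDeclaredHash(file: File, expectedSha256Hex: String): Boolean {
    val digest = MessageDigest.getInstance("SHA-256")
    file.inputStream().use { input ->
        val buffer = ByteArray(8192)
        var read = input.read(buffer)
        while (read >= 0) {
            digest.update(buffer, 0, read)
            read = input.read(buffer)
        }
    }
    val actual = digest.digest().joinToString("") { "%02x".format(it) }
    return actual.equals(expectedSha256Hex, ignoreCase = true)
}

fun main() {
    // Illustrative usage: the path and hash would come from the declaring mod's metadata
    val ok = verifyDeclaredHash(
        File("mods/downloaded-library.jar"),
        "0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef" // placeholder
    )
    if (!ok) error("Downloaded jar does not match its declared hash; refusing to load it")
}
```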

Code Signing

Binary artifacts released to mod repos should be signed by their author.

Templates and help docs for e.g. Fabric Loom and ForgeGradle should make the UX as easy as possible to:

  • Build releases in CI
  • Sign jars in CI at build time
  • Upload the signed jars to mod repos in CI

Moving release creation to CI is already more than what many modders do (most people don't even tag their releases). Keep in mind many modders are amateur programmers who have not worked in industry and don't know what any of this is.

I can't stress enough how important it is for this to be as well documented, easy, and ready-to-use as possible, or else no one will use it.

Eventually, mod repos should require signatures for every single upload and reject any unsigned artifact outright. This will need a rollout period, and still require good education materials.

Discussion and Action Items

  • Where should public keys be held and associated with their authors? In other words, what prevents a hacker from replacing a public key as well as the signed binary at the same time?
    • In the Modrinth user account (behind 2fa)?
      • Without organization accounts this can be annoying for mods that can be uploaded by multiple people?
      • If Modrinth is compromised, a hacker can replace the pubkey and upload a new signed binary at the same time
  • PGP signing or Java jarsigner stuff?
    • PGP is the standard and is used for Maven Central. My impression is that no one uses the Java-specific stuff. OpenSSH signing is also an option but is much newer and has no established ecosystem. PGP is probably the best choice, unfortunately. We can automate away all the crappy parts of it with helpers in Gradle plugins (a minimal sketch follows this list).
  • Prior art
    • Forge has had support for signature stuff for years and it's gone unused. What can we learn from it?
  • Launchers should likely include UI elements to indicate mod signature verification status.
  • Immediate followups
    • Modrinth and CurseForge design how public key association/storage will work
    • Mod loaders write extensive documentation about how to:
      1. Create releases in CI using git tags
      2. Automatically sign and upload artifacts from CI
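
As one data point for the "helpers in Gradle plugins" idea above, Gradle's stock signing plugin can already produce detached PGP signatures from a key supplied through CI secrets. A minimal sketch, assuming illustrative environment variable names:

```kotlin
// build.gradle.kts -- a minimal sketch of PGP-signing the built jar in CI
plugins {
    java
    signing
}

signing {
    // ASCII-armored private key and passphrase injected as CI secrets;
    // the environment variable names are illustrative, not a standard
    val signingKey = providers.environmentVariable("SIGNING_KEY").orNull
    val signingPassword = providers.environmentVariable("SIGNING_PASSWORD").orNull
    useInMemoryPgpKeys(signingKey, signingPassword)
    // Produces a detached .asc signature alongside the built jar
    sign(tasks["jar"])
}
```

Where the matching public key lives and how it is pinned to an account on the mod repo is exactly the open question in the first bullet above.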

Emi: this doesn't necessitate a required process

Starchild: What do we want to prevent here and where? Mod repo can just associate a key with the user and require it in upload, but enforcement/verification can also happen in the loader.

Jasmine: I recently uploaded to Maven Central which enforces code signing. Process was mostly painless via buildscripts, hardest part was finding the docs.

Emi: Relaying from text chat, what is the point of signing? The main reason is for identifying actors.

cpw: Wrote codesigning for Forge; it's been there around 10y. It might have detected stage0 and probably stage3 if the signature wasn't stripped. We've wanted people to do it. Most people in Forge use it these days. Fingerprints for signatures show up in debug logs, etc. That's the primary purpose. IMO it needs to be pushed a layer up. There is no mechanism to distribute trust: a modder can tell their own jar is good, but the user can't. That requires a trust authority in the mc ecosystem. Setting one up is a complex endeavour. The only viable actors are CF and Modrinth, since they already track the stuff you need to track. +1 on badging on the mod repos. Doing it on the clientside? We're far too late for that at the moment. If the entire ecosystem was signed, we could revisit.

Emi: Is it possible to have a collaboratively controlled third party CA? Adding two steps is a bit much.

cpw: You can sign with multiple CAs at once, so that isn't necessary

Fury: We can look into it.

cpw: If you're committing to it, that's awesome :)

Geo: IMO would be best if a trusted third party does it. Diff platforms have diff policies. Modrinth interested in this as well, but we have to move carefully.

Emi: Are you willing to work with CurseForge for a third party CA?

Geo: Yeah. Everyone should put their differences aside for security; even though Modrinth wasn't affected directly, it still was indirectly.

cpw: Would be nice to see movement after 10y

Sandboxing

Sandboxing the Minecraft process is another defense strategy that can be used to limit the blast radius of attacks coming from malicious mod code.

In general, it's a hard problem.

Minecraft should only need access to:

  • The internet, to authenticate, log into servers, etc.
  • Filesystem access to the .minecraft instance folder and its recursive descendants

The main thing of interest to lock down here is filesystem access.

On Windows, there does not seem to be any simple-to-deploy sandboxing system (besides "play Bedrock"). The closest is Chromium's model, which would require significant invasive changes to the game.

On macOS, we could use Apple's built-in sandboxing system, known as "Seatbelt". It is already enforced for App Store apps, but not for external apps like Minecraft. However, it is still not possible to use it to restrict network access to certain sites. Its file-permission configuration can be used to restrict the game's filesystem access, which could prove good enough to limit malware and attacks from mod code. [image: Seatbelt file permission configuration]

On Linux, we have a couple options. First is SELinux/AppArmor. These frameworks are infamous for being insanely hard to configure and as a result have not been widely deployed. However, the policies we seek are quite simple, so it's possible this could be an avenue.

In OpenBSD parlance, all we want to do is unveil(2) to all paths in the instance root, but unfortunately this interface isn't available in Linux.

Second is using a system such as Flatpak.

timoreo: Executing untrusted code is always unsafe, regardless of what sandboxing you have. Sandboxing is always your last resort. We have a flatpak of Prism which is sandboxed and restricts a lot of access. Prevented this specific attack. Can be difficult, e.g. you might think you can restrict mic, but what if a mod adding voice chat wants it? Needs to be restricted but allow freedom for modders. Linux has a lot of techniques, can restrict pretty much everything.

ZekeZ: Main issue is Windows and macOS, especially Windows.

Emi: Concern for several mods. Locking down fs access can be problematic for mods that want global instance-independent configs

ZekeZ: What was the problem with EMI?

Emi: There is a user-facing option to store stuff in ~/.minecraft instead of the instance local one

timoreo: You can case-by-case allow files. Anyways, what about network?


Recap

kennytv: Hi from Paper. Some overlap with modding, and we have our own plugin repo. Would be interested in one maintainable scanner for identifying current malware; there are a bunch right now. On signing, Sponge also had that for a while. They basically gave up on it, because people didn't know how to use it. Very hard to get novices to understand it.

Lambda: Would like to see mod hosts add more 2FA, or even only allow uploads via API/CI. That can be a bit annoying, but it can prevent token stealing from resulting in bad uploads. Currently don't know if Modrinth has 2FA. CurseForge has it, but only for payouts. Want to see this expanded.

Geo: We were working on 2FA before this started. Will also allow scoped tokens (e.g. unsigned builds only from this token, signed builds only from that token). 2FA has a flaw, which is that your session token is still vulnerable to compromise; the LTT YouTube channel was compromised this way. Planning on launching new auth stuff in a couple of weeks.

Emi: To point out, the LPS authors did have 2FA.

willie: Reiterating the point about building, signing, releasing in CI

Geo: One thing platforms can do is a general checklist when authors first upload a project. On Modrinth we have a guide for literally everything. When we added descriptions and icons to the new-upload checklist, the rates of doing so jumped up.

Emi: Recently implemented CI publishing. Just copied another GitHub workflow file. This isn't maintainable.

Lambda: CI is not a thing a first-time modder will do. There's a learning curve to modding: experimentation, then publishing, then fixing stuff after you get popular. It's very daunting for a beginner due to the lack of documentation for modding workflows. CI isn't really in the example mod templates.

Jared: I host the BlameJared Maven with lots of projects. Encourage people to use CI as much as possible. The default MDK just publishes the artifact, not even sources. Need more examples and guides to set up Jenkins, etc.

willie: GH Actions is probably easier for beginners than Jenkins no?

Jared: Not familiar with how GH does it now. All files on my Maven are built locally; not sure how remote publishing would work.

Emi: For a lot of mods, a Maven is not strictly necessary; they can use CurseMaven or Modrinth's Maven. Useful for mods that don't intentionally expose themselves as APIs.

willie: +1 that Mavens are second order; most mods are leaf content mods that don't need Maven publishing.

Jared: CurseMaven is not a long-term option. CurseGradle has been taken off GitHub and its source is gone; nobody has noticed or spoken up about it.

willie: overall action item recap

Fury: Thanks everyone. Overwolf has grown over the years. Some folks reached out and provided valuable information. If you have further information, feel free to contact us; it would help a lot.