My group makes a custom executable to reflash a hardware device we produce. We build it for Linux and Darwin.
Trying to get the program to work with our Mac users has become harder and harder. These are all internal developers.
Enabling developer mode and allowing Terminal execution isn't enough. Disabling the quarantine bit works - sometimes - but now we're getting automated nastygrams from corporate IT threatening to kick the laptops off the network. I'm exhausted. The emergency workaround, which I tell nobody about, is way less secure than if they just let us run our own software on our own computer.
I once urgently needed `nmap` to do some production debugging ASAP. Unfortunately, the security tools would have flagged it immediately on my machine, as I knew from previous experiments. Solution: compile my own binary from source, then quickly rename it. I assume this "workaround" was totally fine with the sec department. At least production got fixed and money kept flowing.
> At least production got fixed and money kept flowing.
You were denied the tools to get your job done. You've put yourself at risk by applying an unapproved workaround.
Never ever do this (unless you hold substantial shares). Let the company's bottom line take the fall. If that's the only thing they care about, that's your only way to make the problem visible.
Unfortunately the real world isn't black and white. Yes, according to company policy, I should watch the world burn and do nothing while the company bleeds money from broken customer SLAs. Of course, after first submitting a ticket to get nmap approved, which takes days. Extra points if I'm on call; then raking in that sweet incident money is great.
But the underlying SRE culture here is that, if you know what you are doing and have the functioning brain of a responsible person, you'll be forgiven for jumping over the fence if it means putting out a fire on the other side of it. We aren't kids.
There’s a middle ground. Get the appropriate stakeholders involved in the decision, including security. Let security be the ones to keep the system down, if it comes to that. Or, let the business operations folks make the decision to go over security’s head. Either way, this is not something an engineer tasked with fixing an outage should be making the decision on.
Engineers _should_ have leeway in how they resolve issues. As I read, though, you have a company policy which explicitly disallows the action you needed to take to fix the problem (if I misread, my apologies). Getting the stakeholders involved is the responsible thing to do when policies need to be broken.
Ideally, the way this kind of situation gets handled should be documented as part of a break-glass policy, so there’s no ambiguity. If that’s not the case, though, the business should get to decide, alongside the policy maker (e.g.: security), whether that policy should be broken as part of an emergency fix, and how to remediate the policy drift after the crisis.
If you’re all tight enough that you’re allowed to make these kinds of decisions in the heat of the moment, that’s great, but it should be agreed upon, and documented, beforehand.
Well, I found out the hard way that company culture or values can mean nothing if you don't CYA. Granted, the shop was small enough that our team was in charge of both the security policies and ops, but still, on one unfortunate occasion I stepped outside my area of responsibility to "do what's right" and got punished. The next time I was in a similar situation, well, I walked away from the fire and grabbed the popcorn.
By the way, I'm still burnt out. This work is stressful. Don't let it take away what's already scarce for you.
`xattr -cr <file>` should clear the extended attributes, including the com.apple.quarantine "downloaded" flag, and make it as if the software was compiled on the machine itself, bypassing the ever-so-annoying Gatekeeper.
For binary patching: `codesign --force --deep -s - <file>` (no developer ID required; "ad-hoc signing" just updates a few hashes here and there). Note that you should otherwise not use codesign, as signing is normally the linker's job.
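Concretely, a minimal sketch of those two steps, assuming a downloaded binary at ./flasher (placeholder path):

    # Strip all extended attributes recursively, including the
    # com.apple.quarantine flag that Gatekeeper keys off of.
    xattr -cr ./flasher

    # After patching the binary, re-sign it ad hoc ("-s -" means no
    # developer ID); this just regenerates the embedded code hashes.
    codesign --force --deep -s - ./flasher

    # Optional sanity check that the ad-hoc signature is now valid.
    codesign --verify --verbose ./flasher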
Very aware of the attributes; unfortunately, these machines are on a global corporate network, so there are layers and layers of monitoring software to prevent internal and external attacks. Changing permission bits on an OSX executable is instantly noticed and escalated as a possible security breach.
Last time we did this I had to spend a week explaining to management that Macs could actually run software other than PowerPoint and it was necessary for our job.
The local workaround that we use is to just spin up a Linux VM and program devices from there. The less legal workaround is using WebUSB and I'm afraid to even tell the necessary people how I did it, because it's sitting out on a public-facing server.
...and there's an Apple developer support person, Quinn, who appears to be heavily if not solely dedicated to helping developers do binary signing/notarization/stapling correctly.
Quinn also has their email address in their sig so people can just reach out via email without even needing an Apple account, or if they prefer more confidentiality.
As someone who actually signs, notarizes and distributes desktop apps for macOS, I can safely say their documentation is less than ideal.
Maybe it's because I'm using the Electron framework, which makes things more complicated, but I don't really understand why there is a difference between the different types of certificates (Developer ID, Apple distribution, macOS distribution), and I had to guess which one to use every time I set it up.
Also, why is notarization a completely different process from code signing, and why does it require a completely different set of credentials? Seems odd to me.
> Also, why is notarization a completely different process from code signing
Because they do completely different things. Signing is proof that you were the one to write and package that software; notarisation is an online security check for malware. If I recall correctly, you still sign but do not notarise when distributing to the Mac App Store.
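One rough way to see that the two are checked separately (the app path here is just an example):

    APP="/Applications/MyTool.app"   # placeholder path

    # Checks only the code signature: who signed it and whether the hashes match.
    codesign --verify --deep --verbose=2 "$APP"
    codesign --display --verbose=2 "$APP"

    # Asks Gatekeeper whether it would allow the app to run; for downloaded
    # apps this is where notarization (and quarantine) status comes into play.
    spctl --assess --type execute --verbose "$APP"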
OMG, this. I was working on a tool to help integrate password managers on macOS and I got completely blocked by the notarizing requirements. Some things literally cannot be built for macOS as open source software, now.
I don't really think saying documentation exists says much when Apple is notorious for having documentation that's either borderline or downright useless. It's generally the norm that some random blog post from a decade ago is more useful than their documentation, and I say this from firsthand experience.
Can you sign and notarize your own software made for internal use with your own infrastructure? If so, then this is a valid response. If not, then this is an irrelevant response because the issue is going through Apple, not the process being difficult or undocumented. If I own the device, then I should be free to decide what the sources of authority over it are.
Edit: I haven't tested it yet, but it does seem that you can sign an executable with your own certificate (self-signed or internal-CA-issued); however, you can't notarize it. Right now, notarization is only required for certain kinds of Apple-issued developer certificates, but that may change in the future.
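If that's right, the self-signed route would look roughly like this; "Internal Code Signing" and ./flasher are placeholders, not real names:

    # Sign with a self-signed (or internal-CA-issued) identity from the keychain.
    codesign --force --sign "Internal Code Signing" ./flasher

    # Verifies the signature itself; Gatekeeper still won't treat the binary as
    # notarized, so this only helps if your machines trust that certificate.
    codesign --verify --verbose ./flasher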
Anecdotally, I was not able to find any way to notarize software for internal use, without paying for a $99 developer account. Though I would have been willing to pay, I know that others who might want to build the software wouldn’t, so I abandoned my project. I suppose I could have maintained it as open source with the developer account required to build, but it seemed disingenuous to me at the time.
> I mean, come on.
Is that really necessary? Obviously there are enough people who did not know about, or find helpful, the resources you’re referring to, that we have people complaining on Hacker News. This isn’t exactly a novice’s forum. Perhaps the problem lies with visibility and accessibility of the support resources, rather than all of the people who have seen notarization as a hurdle to getting real work done.
btw, for those who don’t want to search, Quinn’s signature states:
“
Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"
I understand that you're doing it on principle, but for a software development team, $99/year is a really minuscule price to pay to be able to build / notarise / distribute software.
Developers pay exorbitant amounts of money for much less value, and the idea of putting your teammates at risk to stick it to Apple is kind of sad, bordering on negligence from a business POV.
The principle is what matters. The amount is not the issue. The issue is that there is a cost at all. "It's so cheap" is never an excuse for charging for something that should be free. In this case, running software you have no intent to charge for, on your computer. It's as if someone started charging $0.01/month for breathable air. "But $0.01 is trivial," would not excuse it.
It costs money, and isn't free, for a reason you're not acknowledging. I don't think it's a major profit center for Apple.
It's about setting a higher floor for malicious actors than "random botnet residential IP + a captcha solving service". It's about proving some semblance of identity through a card number and a transaction that goes through without a chargeback.
As the case upthread shows, there's plenty to dislike about a system that inhibits running code built for personal use. And it's obviously neither foolproof nor without collateral damage. Reasonable people can debate whether it's worth it. But it still ought to be acknowledged that the motivations are closer to the reason you have to identify yourself and pay a nominal fee to drive a vehicle on public roads.
I don't buy it. Or rather, I am willing to believe that some team at Apple has convinced itself that this makes sense, but they're wrong.
In particular, the security boundaries are nonsensical. The whole model of "notarization" is that the developer of some software has convinced Apple that the software as a whole (not a specific running instance) is worthy of doing a specific thing to the system as a whole.
But this is almost useless. Should Facebook be allowed to do various things that can violate privacy and steal data? What if the app has a valid reason to sometimes do those things?
Or, more egregiously, consider something like VSCode. I run it, and the fancy Apple sandbox helpfully asks me if I want to grant access to "Documents." The answer is really "no! -- I want to grant access to the specific folders that I want this workspace to access", but MacOS isn't even close to being able to understand that. So instead, one needs to grant permission, at which point, the user is completely pwned, as VSCode is wildly insecure.
So no, I really don't believe that MacOS's security model makes its users meaningfully more secure. At best, the code signing scheme has some value for attribution after an attack occurs, but most attacks seem to involve stolen credentials, and I bet a bunch just hijack validly-notarized-but-insecure software a la the VSCode example.
Notarization is not a trusted system on macOS - or rather, notarized binaries still have a "this was downloaded from the internet" prompt, and the user is meant to make a decision on whether it is trustworthy.
Notarization does some minimal checks, but is mostly about attaching a real identity so that maliciousness has at least some real-world consequences. The most obvious being that you lose the ability to get more apps notarized.
Actually, the cost is not the issue (you are paying for it one way or the other); the issue is needing authorization to take such an action on your (supposedly) own hardware.
Adding signing as a requirement can easily turn what was once a very simple distribution mechanism into something much more complex: now you need to manage signing certificates and keys just to be able to build your thing.
In contrast to this point, as long as I use Xcode and do the same thing I've always done, letting it manage provisioning and everything else, I don't have a problem. However, I want to use CI/CD. Have you seen what kind of access you have to give fastlane? It's pretty wild. And even after giving it the keys to the kingdom, it still didn't work. Integrating Apple code signing with CI/CD is really hard, full of very strange error messages and incantations to make it "work".
I don't know about fastlane, since my CI/CD is just a shell script, and signing and notarising is as hard as (checking the script) running `codesign ...` followed by `notarytool submit ... --wait`
Yes, you need to put keys on the build server for the "Developer ID Application" (which is what you need to distribute apps outside of AppStore) signature to work.
You do not need to give any special access to anything else beyond that.
Anyway, it is indeed more difficult than cross-building for Darwin from Linux and calling it a day.
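For reference, a stripped-down sketch of that kind of script; the identity name, app path, and keychain profile are placeholders, not anything from the actual script:

    #!/bin/sh
    set -e

    APP="build/MyTool.app"                                          # placeholder
    IDENTITY="Developer ID Application: Example Corp (TEAMID1234)"  # placeholder

    # Sign with the Developer ID identity already imported into the build keychain.
    codesign --force --options runtime --timestamp --sign "$IDENTITY" "$APP"

    # Zip the app and submit it for notarization; --wait blocks until Apple
    # returns a verdict. "ci-profile" was created once beforehand with
    # `xcrun notarytool store-credentials`.
    ditto -c -k --keepParent "$APP" MyTool.zip
    xcrun notarytool submit MyTool.zip --keychain-profile "ci-profile" --wait

    # Staple the notarization ticket so the app verifies offline.
    xcrun stapler staple "$APP"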
You seem to be comparing a single dev sending apps to the world vs a corporate team pushing to employees (if I get parent's case right).
In most cases, just involving account management makes the corporate case 10x more of a PITA. Doing things in a corporate environment is a different game altogether.
Do you distribute OSS software which requires notarizing? If so, have you found a way to let the community build the software without a paid developer account? I would be very interested in a solution which allows OSS development relying on protected APIs without requiring anyone who builds the app to have a paid developer account.
Code signing is absolutely disgusting practically and philosophically. It has very reasonable and good intent behind it, but the practical implementations cause great suffering and sadness both for developers (cert management, cost, tools) and end-users (freedom of computing).
The tool is built deep into our CI/CD chain. The whole thing is a house of cards built on a massive pile of tinder next to an open drum of kerosene. You want me to integrate Xcode into that?
Last time I tried setting up an Apple developer license inside a large corporation, one that they paid for and not tied to me or my credit card, it was also a nightmare.
Who said anything about Xcode? The codesign tool is part of macOS, not Xcode. The CLI tool for notarization is bundled with Xcode, but you don't have to use it; they have an official REST API that you can use directly.
Sure it's trivial, but it is tacit acceptance that you need permission to make a program on their platform. Permission that needs to be renewed year over year. Permission to earn a living on this platform.
Permission that can be revoked for any reason, including being compelled by someone with more power than Apple.