Yes, this is what I was going to comment, adding that it was funny they used a coal miner as an example, since the side of my family that has the mood disorders were all coal miners in central PA.
I think the person who came up with this shouldn't be fired, the person who _approved_ it should be reprimanded.
There's some intersection point between who "owns" the wallet and who is coming up with ways to generate marketing revenue.
Whoever lives at that intersection point is the real shot caller here, aren't they?
Imo you don't fire people for generating bad ideas; that just creates a culture of not thinking outside the box. But the person who is filtering those ideas is the critical linchpin.
> Imo you don't fire people for generating bad ideas,
If an idea is that bad, at the very least they should be transferred into a role that doesn't involve coming up with good ideas, since obviously that is outside of their skill set. And what's the argument for not firing the chain of people who approved it? Their job was to stop bad ideas, and they catastrophically failed.
> at the very least they should be transferred into a role that doesn't involve coming up with good ideas, since obviously that is outside of their skill set.
Proposing one bad idea is not unusual for people whose job is idea-driven. When ideas are the primary currency of your occupation, you'll necessarily generate some losers. But in a company of Apple's size, that's why you rely on colleagues and - critically - a more robust approval process to move from idea to deliverable.
I hate your idea of firing (from org. or role) the idea person based on one bad idea. I don't hate the idea of firing (from org. or role) the leaders accountable for getting this idea into the world.
> Why shouldn’t it be possible for people to lose their jobs?
This is a strawman argument that seems made in bad faith, but I'll bite anyway: I am not saying that no single bad idea or mistake should result in the loss of a job. I am saying that most of the time such a response would be an extreme reaction, especially when directed at the lower-level source of the ideas vs. the more senior accountable parties who are paid to know better.
Magnitude matters, as does accountability. Creating this world of extremes, where one mistake or poor idea leads to termination, is a pretty quick way to a toxic and non-productive work environment. Enact accountability where it sits, not across the entire chain.
I think you and I are saying the same thing honestly.
The parent seems to be of the mind that it's never a viable option for someone to lose their job over something, which I find an extreme position in itself.
I'm not sure how this context was lost, as precisely this point is what I'm getting at. I'm not jumping to extremes as some imply (including you); I'm saying it should be on the table for the most hopelessly egregious offences.
You're seriously comparing a single advertisement to crimes like murder? Crimes that land you in prison are generally crimes that even children can understand are wrong. You're using "extremely poor decision" for 2 wildly different things, and if you think they're remotely equivalent, perhaps you should reflect on why you think that.
I am seriously suggesting that a single bad decision (like taking some money from the cash register) can land you in prison, so why do we hold jobs to a higher standard?
Learning from our mistakes is one thing, and slip-ups happen after all, but I’m just drawing a comparison to “a single misjudgement”.
If you don't know society's values (stealing is wrong) or a company's values (tarnishing the brand by looking cheap and desperate), the outcome should probably be the same: expulsion or exclusion.
Also, don’t go to the most extreme negative interpretation of what someone says, it’s against guidelines.
Either you’re suggesting jail is too punitive a punishment or that being fired should never be a viable option.
I’m not saying we should jump to extremes, I’m saying that the option should be on the table if you violate the core principles of the company, especially in a way that causes loss of consumer trust.
What’s the difference between defrauding Ford out of $200M and causing $200M in damages because I decided that every new Ford will include the words “I solemnly swear I will shit on the American flag when requested”?
In essence, in either case I am putting my own needs above the needs of the company and above the needs of the consumer - in a way that undermines future sales for the company too.
There’s bad ideas like “it wasn’t possible to execute this the way we thought we could”, and bad ideas like “this goes against the core values of what this company is”.
The first is something that might have gone better in better circumstances, so it’s a learning opportunity. The second shows you either don’t understand the company and decided to carry on despite that, or you just don’t care about the company, but either way it reflects poorly enough on an individual that a firing should be on the table.
You definitely fire people for pitching ideas that are against the ethos of the company; otherwise you have no culture. It shouldn’t come down to one approver on the wallet side to see how dumb this was.
Yes, but there’s nuance. We each assume a version of events and nobody really knows. In my experience, big tech companies attract a certain type of person (among others) who will not only think of stuff like this, but actively fight for it and consequences to the long term be damned. VPs who actually approve this stuff will have limited time to think about it and a lot depends on the proposal.
This looks like a group PM level decision. Bluntly, at that level we get paid enough to exercise good judgement.
That's fine if actors are not doing much (e.g. just pathfinding and shooting), but it is likely to fall apart long before scripting becomes a thing, since there are a lot of tasks where different actors do want to mutate the same object.
This is AI; having multiple actors plan to gather the same resource etc. isn’t necessarily a flaw. It does, however, result in meaningful game design decisions getting tied up in how the game engine is designed.
This is a great example. It's probably pretty insignificant from a player's perspective in most contexts. And you could most certainly design some acceleration structures to handle AI convergence for your specific use case.
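For illustration, one such structure could be as simple as a reservation table (all names here hypothetical): an actor claims a resource before pathing to it, so later actors pick something else instead of converging on the same target:

    #include <cstdint>
    #include <optional>
    #include <unordered_map>
    #include <vector>

    using ResourceId = std::uint32_t;
    using ActorId    = std::uint32_t;

    // Actors claim a resource before pathing to it; later actors skip
    // anything already claimed, so they don't all converge on one target.
    class ReservationTable {
        std::unordered_map<ResourceId, ActorId> claims_;
    public:
        bool try_claim(ResourceId r, ActorId a) {
            auto [it, inserted] = claims_.emplace(r, a);
            return inserted || it->second == a;  // idempotent for the holder
        }
        void release(ResourceId r, ActorId a) {
            auto it = claims_.find(r);
            if (it != claims_.end() && it->second == a) claims_.erase(it);
        }
    };

    // Pick the first nearby resource this actor can actually claim, if any.
    std::optional<ResourceId> pick_target(ReservationTable& table, ActorId actor,
                                          const std::vector<ResourceId>& nearby) {
        for (ResourceId r : nearby)
            if (table.try_claim(r, actor)) return r;
        return std::nullopt;
    }

Whether double-claiming is a bug or a feature is exactly the kind of game design decision the parent is talking about, but the structure itself is cheap.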
Two, surely? The previous one still being used and the new one being written.
(Note that this is how most rendering artifacts were fixed long ago - the on-screen and off-screen buffers were swapped, so nobody would "see" in-progress scenes.)
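A rough sketch of that double-buffer idea (assuming a single writer, and that readers are done with the old front copy before it is reused as the back):

    #include <array>
    #include <atomic>

    // Readers always see the completed "front" copy; the writer fills the
    // "back" copy, then swap() publishes it, so nobody ever observes a
    // half-written State.
    template <typename State>
    class DoubleBuffer {
        std::array<State, 2> buffers_{};
        std::atomic<int> front_{0};
    public:
        const State& read() const {
            return buffers_[front_.load(std::memory_order_acquire)];
        }
        State& back() {
            return buffers_[1 - front_.load(std::memory_order_relaxed)];
        }
        void swap() {
            front_.store(1 - front_.load(std::memory_order_relaxed),
                         std::memory_order_release);
        }
    };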
Using assembly is not really more precise in terms of solving the problem. You can definitely make an argument that using a higher-level language is equally if not more precise. Especially since your low-level assembly will be limited in which architectures it can run on, you can state that the C++ that generates that assembly is "more precisely defining a calculator program".
I agree with your general point, but C++ isn't a great example, as it is so underspecified. Imagine as part of our calculator we wrote the function:
    int add(int a, int b) {
        return a + b;
    }
What is the result of add(32767, 1)? C++ does not presume to define just one meaning for such an expression, or even any meaning at all: if int happens to be 32 bits wide the answer is 32768, but if it is 16 bits wide the addition overflows, which is undefined behaviour. What to do when the program tries to add ints that large is left to the personal conscience of compiler authors.
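For illustration, here is one way to pin the meaning down yourself (saturating on overflow is just one possible policy):

    #include <limits>

    // Detect overflow before it can happen and saturate, so the result
    // no longer depends on the compiler or on how wide int happens to be.
    int add_checked(int a, int b) {
        if (b > 0 && a > std::numeric_limits<int>::max() - b)
            return std::numeric_limits<int>::max();
        if (b < 0 && a < std::numeric_limits<int>::min() - b)
            return std::numeric_limits<int>::min();
        return a + b;
    }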
Precision is not boolean (present or absent, 0 or 1); there may be many numbers between 0 and 1. Compared to human languages, programming languages are much more precise, which makes the results much more predictable in practice.
I can imagine an OS being written in C++ and working most of the time. I don't think you can replace Linux, written in C, with any number of LLM prompts.
An LLM can be a [bad, so far] programmer, but a prompt is not a program.
Using code may not be more precise, in terms of solving a problem, than English. Take the NHS. With better AI, saying "build a good IT system for the NHS" may have worked better than this stuff: https://www.theguardian.com/society/2013/sep/18/nhs-records-...
I was stoked when I saw these headlines because I generally prefer Oakley to Ray Ban in terms of style, but these look nothing like Oakleys! Personally I don't like round glasses; I like more square glasses. Dang!
Yeah, I am sure that some of the users may have a higher skill set than the authors. But the majority of users suck at designing, and would add more fuel to the problems, and complicate any refactoring efforts.
Just signed a contract with a lawyer's pen that was too heavy on ink, so I got ink all over my hands. It was a pain to put my hat on afterwards without getting ink on it. I seriously considered bringing my own pen beforehand. Guess I will next time.
Not only is this pen ubiquitous, but its ink flow is usually pretty light, which keeps it from smearing on your hands or the page.
Assuming the stack trace is generated by walking up the stack at the time the crash happened, nothing that works like a C function pointer would ever show up in it. Assigning a pointer to a memory location doesn't generate a stack frame, so there's no residue left on the stack that could be walked back.
A simple example: if you were to bind a function pointer in one stack frame and then immediately return it to the parent stack frame, which then invokes that bound pointer, the frame that bound the now-called function would literally not exist anymore.
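A minimal sketch of that, with hypothetical names:

    #include <cstdio>

    using Fn = void (*)();

    static void handler() { std::puts("called"); }

    // bind() assigns a function pointer; the assignment itself creates no
    // stack frame, and bind()'s own frame is gone by the time f is invoked.
    static Fn bind() {
        Fn f = handler;
        return f;
    }

    int main() {
        Fn f = bind();
        f();  // a crash inside handler() would show main(), never bind()
    }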
As someone who's literally done this in games over the last ten years, this sentence just reeks of "Apple fanboy". It's not delightful. It's not sprinkling. No Android person says this, and the dev experience of using Xcode is just bad. Writing Apple code sucks. It breaks constantly. The iOS integration layer is always the most brittle piece of your system, because Apple forces you to use Xcode and their complicated code-signing scheme, and they constantly break their own APIs and deprecate stuff. Idk why this "sprinkle in delight" got under my skin; it just seems like it's straight from a corny WWDC talk. We can't give Apple a pass 100% of the time just because the MacBook is the best laptop. We need to tell it like it is.