In ordinary arithmetic and algebra there is no reasonable definition of n/0, because allowing one would let you prove contradictory things.
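To see the contradiction concretely: division is defined so that a/b = q exactly when a = b*q. Applying that to 1/0 gives

    1/0 = q   =>   1 = 0*q = 0,

since 0*q = 0 for every q. And once 1 = 0, multiplying both sides by any n shows n = 0, so all of arithmetic collapses.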
Outside of proof systems and pure math (in software you write, say), you can define n/0 to be anything you like; you just can't depend on being able to derive mathematically consistent results from that definition.
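For example, here is what picking a convention might look like in Python (the name safe_div and the fallback value 0.0 are arbitrary choices for this sketch, not anything standard):

    def safe_div(n: float, d: float, default: float = 0.0) -> float:
        """Division with an arbitrary convention for d == 0.

        Nothing in arithmetic justifies `default`, so callers must
        not rely on identities like safe_div(n, d) * d == n.
        """
        if d == 0:
            return default
        return n / d

    print(safe_div(10, 2))            # 5.0   -- ordinary division
    print(safe_div(10, 0))            # 0.0   -- our chosen convention
    print(safe_div(10, 0) * 0 == 10)  # False -- the inconsistency in action

Some systems really do ship such a convention (Lean, for instance, defines n/0 = 0 so that division is a total function). It's harmless as bookkeeping; it just can't be fed back into the usual algebraic identities.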
There are also more exotic math systems. Infinity is not literally a number (treating it as one leads to contradictions), but mathematicians happily work with number systems augmented with infinity as an extra member -- the extended real line is the standard example. One just has to step carefully, since infinity still isn't a number: the "number system" now has a non-number member requiring special handling, which is why expressions like infinity minus infinity are left undefined.
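IEEE 754 floating point is the everyday software version of this idea: it adds inf (and nan) to the ordinary numbers as special members with their own rules. A quick Python illustration:

    import math

    inf = math.inf  # IEEE 754 positive infinity

    print(inf + 1 == inf)         # True -- inf absorbs any finite number
    print(inf > 10.0 ** 300)      # True -- larger than any finite float
    print(1 / inf)                # 0.0
    print(math.isnan(inf - inf))  # True -- the "special handling":
                                  # inf - inf has no sensible value

(Python itself raises ZeroDivisionError for 1.0 / 0.0 rather than returning inf; libraries such as numpy follow the IEEE convention and produce inf.)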
Similarly, there are modern approaches to infinitesimals in which there is a thing "e" that is not zero but whose square e^2 is zero. That is not normal arithmetic, but handled carefully it can be useful.
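The simplest concrete version is the dual numbers: pairs a + b*e multiplied with the rule e^2 = 0, so (a + b*e)(c + d*e) = ac + (ad + bc)*e. A minimal Python sketch (the class name Dual is mine, not a library API):

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Dual:
        """A dual number a + b*e, where e != 0 but e**2 == 0."""
        a: float  # ordinary part
        b: float  # coefficient of e

        def __add__(self, other):
            return Dual(self.a + other.a, self.b + other.b)

        def __mul__(self, other):
            # (a + b*e)(c + d*e) = ac + (ad + bc)*e + bd*e**2,
            # and the e**2 term vanishes by definition.
            return Dual(self.a * other.a,
                        self.a * other.b + self.b * other.a)

    e = Dual(0.0, 1.0)
    print(e * e)  # Dual(a=0.0, b=0.0) -- nonzero e, but e**2 == 0

    # The payoff: f(x + e) = f(x) + f'(x)*e, so the e-coefficient of
    # f(Dual(x, 1)) is the exact derivative of f at x.
    def f(x):
        return x * x * x  # f(x) = x^3, so f'(x) = 3x^2

    print(f(Dual(2.0, 1.0)))  # Dual(a=8.0, b=12.0): f(2) = 8, f'(2) = 12

This trick (forward-mode automatic differentiation) is one place where careful handling of a not-quite-number pays off in practice.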