Fun fact: zero and numerals were not invented by the Arabs. The Arabs learnt the concept & use of mathematical zero, numerals, decimal system, mathematical calculations, etc. from the ancient Hindus/Indians.
And from the Arabs, the Europeans learnt it.
The Persian scholar Al-Khwarizmi took up the Hindu/Indian numerals (including the concept of a mathematical zero) and Indian methods of calculation in his own works, and his treatise Al-Jabr, once translated into Latin, gave Europe the word "Algebra" (yup, that branch of mathematics that schoolkids worldwide learn).
When someone responds to comments about Arabic by saying "it still means zero" in Tamil, two languages with no shared root and little similarity, what does that mean?
So is Gemini. But from it I gather there might be something interesting about a word that "loops back" (geographically), even though, evolutionarily speaking, it was a reworking of _independent_ discoveries of "emptiness".
lol I never made that connection — in Turkish, zero is sıfır, which does sound a lot like cipher. Also, password is şifre, which again sounds similar. Looking online, apparently the path is sifr (Arabic, meaning zero) -> cifre (French, first meaning zero, then any numeral, then coded message) -> şifre (Turkish, code/cipher)
Nice! Imagine the second meaning going back into Arabic, and then it's a full loop! It can even override the original meaning given enough time and popularity (probably not for "zero", but possibly for another full-loop word).
The Turkish password word may be the same one used for signature? I suspect so, because in Greek we have the Greek word for signature but also a Turkish loanword, τζίφρα (djifra).
Spouse of a linguist here. That is absolutely not true. To summarize a LOT, there are multiple languages that share common roots, which linguists classify into language "families". If you go to https://en.wikipedia.org/wiki/List_of_language_families#Spok... and sort the list by number of current speakers (which adds up to far more than the population of the world because so many people speak two or more languages), you'll find the top five language families are Indo-European (which includes most European languages, including English), Sino-Tibetan (which includes Chinese), Atlantic-Congo (which includes Bantu and many other languages spoken in Africa, most of which you probably won't have heard of unless you're a linguist or you live in Africa), Afroasiatic (which includes Arabic), and Austronesian (which includes Tagalog, which you might know by the name Filipino).
It might be possible to claim that the Afroasiatic languages are all derived from Arabic, but the only influence that the Arabic language has had on Indo-European languages such as English is via loanwords (like algebra, for example). This does not make English a derivative of Arabic any more than Japanese (which has borrowed several English words such as カメラ, "kamera", from camera) is a derivative of English. Borrowing a word, or even a few dozen words, from another language does not make it a derivative. English, while it gleefully borrows loanwords from everywhere, descends from Proto-Germanic (via Old English), with a huge layer of vocabulary borrowed from Anglo-Norman French.
Can I also add that "Arabic numbers" (the numbers we use today) are actually of Indian origin? The Arabs translated the Indian logic/math texts into Arabic, and Western society then worked from the Arabic translations (and their additions, like the work that gave us "algorithm").
I have it on consumer-grade authority that the Indians got them in turn from the Shang dynasty, decimal since ca.1200BCE. Thus proving conclusively that numeral systems naturally travel deasil. Ne'er let thine diʒits, goe widdershins.
Also, as long as we are going down the terminology-nerd rabbit hole: it's Arabic numerals, not numbers. "Numbers" refers to the abstract concept; "numerals" refers to the symbols one uses to write them down.
Yeah, I quoted that to show it was normal usage rather than technical correctness. I did the same for the name whose correct spelling I didn't have as I wrote the comment. Not sure whether I should update it (with your input) or leave it and let people work down the thread.
I don't think determinants play a central role in modern advanced matrix topics.
> I don't think determinants play a central role in modern advanced matrix topics.
Not true at all. They're integral to determinantal point processes, commute distances in graphs, conductance in resistor networks, computing correlations via linear response theory, enumerating subgraphs, representation theory of groups, spectral graph theory... and I'm sure many more.
Suppose you have (let's say) a 3x3 matrix. This is a linear transformation that maps real vectors to real vectors. Now let's say you have a cube as input with volume 1, and you send it into this transformation. The absolute value of the determinant of the matrix tells you what volume the transformed cube will be. The sign tells you if there is a parity reversal or not.
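If it helps to see that numerically, here's a quick sketch (the matrix is arbitrary, chosen just for illustration):

    import numpy as np

    # An arbitrary 3x3 linear map, chosen only for illustration.
    A = np.array([[2.0, 1.0, 0.0],
                  [0.0, 3.0, 0.0],
                  [0.0, 0.0, -1.0]])

    print(np.linalg.det(A))  # -6.0: volumes scale by 6, and orientation flips

    # The image of the unit cube under A is the parallelepiped spanned by
    # A's columns; its signed volume is the scalar triple product, which
    # matches det(A).
    a1, a2, a3 = A[:, 0], A[:, 1], A[:, 2]
    print(float(np.dot(a1, np.cross(a2, a3))))  # -6.0 as well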
Form a quadratic equation to solve for the eigenvalues x of a 2x2 matrix (|A - xI| = 0). The inverse of a matrix can be calculated as the classical adjugate multiplied by the reciprocal of the determinant. Use Cramer's Rule to solve a system of linear equations by computing determinants. Reason that if x is an eigenvalue of A then A - xI has a non-trivial nullspace (using the mnemonic |A - xI| = 0).
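A tiny worked sketch of those uses on an arbitrary 2x2 matrix (numbers picked only for illustration):

    import math

    # Arbitrary 2x2 matrix A = [[a, b], [c, d]], for illustration only.
    a, b, c, d = 4.0, 1.0, 2.0, 3.0
    det = a * d - b * c                                   # |A| = 10

    # Eigenvalues from |A - xI| = 0, i.e. x^2 - (a + d)x + |A| = 0.
    tr = a + d
    disc = tr * tr - 4 * det
    eigs = ((tr + math.sqrt(disc)) / 2, (tr - math.sqrt(disc)) / 2)  # (5.0, 2.0)

    # Inverse via the classical adjugate: A^-1 = adj(A) / |A|.
    inv = [[ d / det, -b / det],
           [-c / det,  a / det]]

    # Cramer's rule for A [x, y]^T = [p, q]^T: replace a column with the
    # right-hand side and divide that determinant by |A|.
    p, q = 5.0, 4.0
    x = (p * d - b * q) / det                             # 1.1
    y = (a * q - p * c) / det                             # 0.6
    print(eigs, inv, (x, y))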
It gives it a different implication. As I read it, an article titled "Lewis Carroll Computed Determinants" has three possible subjects, plus the actual one:
1. Literally, Carroll would do matrix math. I know, like many on HN, that he was a mathematician. So this would be a dull and therefore unlikely subject.
2. Carroll invented determinants. This doesn't really fit the timeline of math history, so I doubt it.
3. Carroll computed determinants, and this was surprising. Maybe because we thought he was a bad mathematician, or the method had recently been invented and we don't know how he learned of it. This is slightly plausible.
4. (The actual subject.) Carroll invented a method for computing determinants. A mathematician inventing a math technique makes sense, but the title doesn't convey that. It'd be like saying "Newton and Leibniz Used Calculus." Really burying the lede.
Of course, this could've been avoided had the article not gone with a click-bait style title. A clearer one might've been "Lewis Carroll's Method for Calculating Determinants Is Probably How You First Learned to Do It." It's long, but I'm not a pithy writer. I'm sure somebody could do better.
"How Lewis Carroll Computed Determinates" is fine and not clickbait because it provides all the pertinent information and is an accurate summary of its contents. Clickbait would be "you would never guess how this author/mathematician computed determinants" since it requires a clickthrough to know who the person is. How is perfectly fine IMO to have in the title because I personally would expect the How to be long enough to warrant a necessary clickthrough due to the otherwise required title length.
I forgot that cipher used to have a different meaning: zero, via Arabic. In some languages it means digit.
https://en.wikipedia.org/wiki/Hindu-Arabic_numeral_system
https://www.open.ac.uk/blogs/MathEd/index.php/2022/08/25/the...
I think it means HN is full of misleading ideas.
We use “Arabic” numerals around the world. So use of an Arabic loan word is unsurprising.
Arabic -> Tamil <- Arabic - Sanskrit
https://en.wikipedia.org/wiki/0#Etymology
What does it mean when someone creates a new account for posting contradictory comments?
Sushi is now an English word. So is hummus, etc.
- cifru -> cipher
- cifră -> digit
https://en.wikipedia.org/wiki/Royal_cypher
https://en.wikipedia.org/wiki/Hindu%E2%80%93Arabic_numeral_s...
https://terrytao.wordpress.com/2017/08/28/dodgson-condensati...
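That post is about Dodgson condensation, the determinant method Carroll (Charles Dodgson) published. As a rough sketch of how the recurrence works (my own toy version, not taken from the post or the article, and it breaks when an interior entry used as a divisor is zero):

    def dodgson_det(matrix):
        """Determinant by Dodgson condensation (toy sketch).

        Each step replaces the current k x k matrix with the (k-1) x (k-1)
        matrix of its 2x2 "connected minors", divided entrywise by the
        interior of the matrix from two steps earlier. A zero interior
        entry used as a divisor is the classical failure mode.
        """
        a = [list(row) for row in matrix]   # current matrix
        prev = None                         # matrix from the previous step
        n = len(a)
        while n > 1:
            b = [[a[i][j] * a[i + 1][j + 1] - a[i][j + 1] * a[i + 1][j]
                  for j in range(n - 1)] for i in range(n - 1)]
            if prev is not None:
                b = [[b[i][j] / prev[i + 1][j + 1] for j in range(n - 1)]
                     for i in range(n - 1)]
            prev, a, n = a, b, n - 1
        return a[0][0]

    print(dodgson_det([[1, 2, 3], [4, 5, 6], [7, 8, 10]]))  # -3.0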
I loathed it and it put me off wanting to get into advanced matrix topics.
Luckily for me I read Axler's "Linear Algebra Done Right" (which uses determinant-free proofs) during my first linear algebra course, and didn't concern myself with determinants for a very long time.
Edit: Beyond cofactor expansion everyone should know of at least one quick method to write down determinants of 3x3 matrices. There is a nice survey in this paper:
Dardan Hajriza, "New Method to Compute the Determinant of a 3x3 Matrix," International Journal of Algebra, Vol. 3, 2009, no. 5, 211-219. https://www.m-hikari.com/ija/ija-password-2009/ija-password5...
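For a concrete example of one such quick method, here's the rule of Sarrus (not the method from the paper above, just the write-down trick most people meet first):

    def det3_sarrus(m):
        """3x3 determinant by the rule of Sarrus: add the three "down-right"
        diagonal products, subtract the three "up-right" ones."""
        (a, b, c), (d, e, f), (g, h, i) = m
        return a*e*i + b*f*g + c*d*h - c*e*g - b*d*i - a*f*h

    print(det3_sarrus([[1, 2, 3], [4, 5, 6], [7, 8, 10]]))  # -3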
Given that Jabberwocky is also quite readable, we shouldn't be too astonished.
You can manually edit it back in.