/r/math
Please read the FAQ before posting.
Rule 1: Stay on-topic
All posts and comments should be directly related to mathematics, including topics related to the practice, profession and community of mathematics.
In particular, any political discussion on /r/math should be directly related to mathematics - all threads and comments should be about concrete events and how they affect mathematics. Please avoid derailing such discussions into general political discussion, and report any comments that do so.
Rule 2: Questions should spark discussion
Questions on /r/math should spark discussion. For example, if you think your question can be answered quickly, you should instead post it in the Quick Questions thread.
Requests for calculation or estimation of real-world problems and values are best suited for the Quick Questions thread, /r/askmath or /r/theydidthemath.
If you're asking for help learning/understanding something mathematical, post in the Quick Questions thread or /r/learnmath. This includes reference requests - also see our list of free online resources and recommended books.
Rule 3: No homework problems
Homework problems, practice problems, and similar questions should be directed to /r/learnmath, /r/homeworkhelp or /r/cheatatmathhomework. Do not ask or answer this type of question in /r/math. If you ask for help cheating, you will be banned.
Rule 4: No career or education related questions
If you are asking for advice on choosing classes or career prospects, please post in the stickied Career & Education Questions thread.
Rule 5: No low-effort image/video posts
Image/Video posts should be on-topic and should promote discussion. Memes and similar content are not permitted.
If you upload an image or video, you must explain why it is relevant by posting a comment providing additional information that prompts discussion.
Rule 6: Be excellent to each other
Do not troll, insult, antagonize, or otherwise harass. This includes not only comments directed at users of /r/math, but at any person or group of people (e.g. racism, sexism, homophobia, hate speech, etc.).
Unnecessarily combative or unkind comments may result in an immediate ban.
This subreddit is actively moderated to maintain the standards outlined above; as such, posts and comments are often removed and redirected to a more appropriate location. See more about our removal policy here.
If you post or comment something breaking the rules, the content may be removed - repeated removal violations may escalate to a ban, but not without some kind of prior warning; see here for our policy on warnings and bans. If you feel you were banned unjustly, or that the circumstances of your ban no longer apply, see our ban appeal process here.
Recurring Threads and Resources
What Are You Working On? - every Monday
Discussing Living Proof - every Tuesday
Quick Questions - every Wednesday
Career and Education Questions - every Thursday
This Week I Learned - every Friday
A Compilation of Free, Online Math Resources.
Click here to chat with us on IRC!
Using LaTeX
To view LaTeX on reddit, install one of the following:
MathJax userscript (userscripts need Greasemonkey, Tampermonkey or similar)
TeX all the things Chrome extension (configure inline math to use [; ;] delimiters)
[; e^{\pi i} + 1 = 0 ;]
Post the equation above like this:
`[; e^{\pi i}+1=0 ;]`
Using Superscripts and Subscripts
x*_sub_* makes x_sub
x*`sup`* and x^(sup) both make x^sup
x*_sub_`sup`* makes x_sub^sup
x*`sup`_sub_* makes x^sup_sub
Useful Symbols
Basic Math Symbols
≠ ± ∓ ÷ × ∙ – √ ‰ ⊗ ⊕ ⊖ ⊘ ⊙ ≤ ≥ ≦ ≧ ≨ ≩ ≺ ≻ ≼ ≽ ⊏ ⊐ ⊑ ⊒ ² ³ °
Geometry Symbols
∠ ∟ ° ≅ ~ ‖ ⟂ ⫛
Algebra Symbols
≡ ≜ ≈ ∝ ∞ ≪ ≫ ⌊⌋ ⌈⌉ ∘ ∏ ∐ ∑ ⋀ ⋁ ⋂ ⋃ ⨀ ⨁ ⨂ 𝖕 𝖖 𝖗 ⊲ ⊳
Set Theory Symbols
∅ ∖ ∁ ↦ ↣ ∩ ∪ ⊆ ⊂ ⊄ ⊊ ⊇ ⊃ ⊅ ⊋ ⊖ ∈ ∉ ∋ ∌ ℕ ℤ ℚ ℝ ℂ ℵ ℶ ℷ ℸ 𝓟
Logic Symbols
¬ ∨ ∧ ⊕ → ← ⇒ ⇐ ↔ ⇔ ∀ ∃ ∄ ∴ ∵ ⊤ ⊥ ⊢ ⊨ ⫤ ⊣
Calculus and Analysis Symbols
∫ ∬ ∭ ∮ ∯ ∰ ∇ ∆ δ ∂ ℱ ℒ ℓ
Greek Letters
Αα Ββ Γγ Δδ Εε Ζζ Ηη Θθ Ιι Κκ Λλ Μμ Νν Ξξ Οο Ππ Ρρ Σσ Ττ Υυ Φφ Χχ Ψψ Ωω
Other Subreddits
Math
Tools
Related fields
Hello r/math! I was thinking and came up with an idea (I don't know whether it makes any sense or has any application, and maybe someone has already come up with this before), but the concept plays with the idea of infinities. Here it goes:
Consider a cylindrical object. We can draw a line segment on the surface of the cylinder, perpendicular to the base and the top, so that it passes through every surface point lying on that perpendicular line. We can draw infinitely many such line segments on the surface of the cylinder without any overlap. In this scenario we have a bounded infinity (the set of points on the surface of the cylinder) within which there are bounded infinities (these line segments) an unbounded infinity of times.
So, in a nutshell, we have a bounded infinity (the set of points on the surface of the cylinder) that contains an unbounded infinite amount (the number of these lines) of bounded infinities (the set of points of each line).
TLDR: a bounded infinity containing an unbounded infinite amount of bounded infinities.
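A hedged way to write down the decomposition being described (notation mine, parametrising the lateral surface by angle and height):

[; S \;=\; \{(\theta, z) : \theta \in [0, 2\pi),\ z \in [0, h]\} \;=\; \bigsqcup_{\theta \in [0, 2\pi)} \{\theta\} \times [0, h] ;]

i.e. the surface splits into uncountably many pairwise disjoint vertical segments, each of which is itself an uncountable (but bounded) set of points.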
The 6 Remaining Unsolved Millennium Problems
P vs NP
Riemann Hypothesis
Birch and Swinnerton-Dyer Conjecture
Navier–Stokes Existence and Smoothness
Yang–Mills Existence and Mass Gap
Hodge Conjecture
Personally, I will go with:
First to be solved: Navier–Stokes
Second: Hodge Conjecture
Third: Birch and Swinnerton-Dyer
Fourth: Yang–Mills
Fifth: Riemann Hypothesis
Sixth: P vs NP
How about you mathematicians in this sub?
Which mathematical event will surprise you the most?
1. An odd perfect number exists.
2. P = NP is proven true.
3. A counterexample to the Collatz Conjecture is found.
4. A counterexample to the Goldbach Conjecture is found.
5. The Riemann Hypothesis is false.
Which among these 5 do you think would surprise you the most if it were proven, discovered, or invented?
Hello guys, I want to know where I can find and buy old books (pre-1920) such as:
- A Treatise on the Integral Calculus, Vol. 1 and 2: With Applications, Examples, and Problems (Classic Reprint) by Joseph Edwards
- A Treatise on the Calculus of Finite Differences [1872 Revised Edition] by George Boole
- Cours de calcul différentiel et intégral (Calcul intégral) by J.A. Serret
And some other books like these. I know I can find these books on Amazon, but I have bought old books online before and the quality was really subpar. I am preferably interested in a physical store that may have these books in the US, or that ships to the US (or Latin America). Thanks!
I'm working on an NLP problem and I have cosine similarity scores (floating-point values ranging from -1 to 1). I basically want to separate them into 3 categories: positive values, negative values, and a third category for data that is too close to 0, depending on some threshold, say epsilon. The problem is how to mathematically define and justify this threshold. I tried numpy's quantiles, but that just divides the data into equal-sized groups, whereas in my case I could have 70% negative values, 10% neutral, and so on, so I can't just divide in equal ratios. I'm writing a research paper, so I can't simply assert that I split the data at -0.05 and +0.05, because that would be arbitrary. So what can I do?
In essence, I think I wish to create clusters or segments in 1D.
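One possible data-driven way around an arbitrary ±0.05 cutoff (my own hedged sketch, not something from the post): cluster the one-dimensional scores into three groups and let the middle cluster define the neutral band, so the boundaries come from the data rather than from a hand-picked epsilon. The `scores` array below is a placeholder, and k-means is only one of several defensible choices (Jenks natural breaks or a Gaussian mixture would play the same role).

```python
# Hedged sketch: derive a "neutral" band around 0 from the data itself by
# clustering the 1D cosine-similarity scores into three groups.
# `scores` is a placeholder; replace it with the real similarity values.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
scores = rng.uniform(-1, 1, size=1000)                 # placeholder cosine similarities

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scores.reshape(-1, 1))

# Relabel clusters so that 0 = most negative centre, 1 = neutral, 2 = most positive.
order = np.argsort(km.cluster_centers_.ravel())
relabel = {old: new for new, old in enumerate(order)}
labels = np.array([relabel[l] for l in km.labels_])

neutral = scores[labels == 1]
print("data-driven neutral band:", neutral.min(), "to", neutral.max())
```

Whatever method you pick, reporting its objective (here, within-cluster variance) gives the paper a justification for the threshold beyond "it looked reasonable".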
So on his latest Joe Rogan podcast, Eric talks about recovering Einstein’s theory. What does it mean to recover a theory?
Every mathematician's Wikipedia page shows that they started very young and had an intelligence that set them apart. Isn't there anyone who started late, in their 20s?
Hey guys, I have a question about the mathematical history of orthogonal polynomials. In particular, I was wondering if anyone in this community might happen to know the nuances of Legendre and Chebyshev polynomials as used in numerical interpolation. Both types of polynomials are orthogonal over [-1, 1] and their definitions are very similar. What are their differences during interpolation? I haven't been able to find much literature about situations in which one type is clearly favorable over the other. Do such situations exist?
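One illustrative way to probe the difference numerically (my own sketch, not from the post; the test function, degree, and grids are arbitrary choices): interpolate at Chebyshev nodes with a Chebyshev series and compare against a least-squares Legendre fit of the same degree.

```python
# Hedged sketch: compare Chebyshev and Legendre approximations of the same
# degree on [-1, 1] using numpy's polynomial modules.
import numpy as np
from numpy.polynomial import chebyshev as C, legendre as L

def f(x):
    return 1.0 / (1.0 + 25.0 * x**2)                   # Runge's function, a classic test case

deg = 20
x = np.linspace(-1.0, 1.0, 2001)                       # dense grid for measuring the error

# Interpolation at the deg+1 Chebyshev nodes.
nodes = np.cos(np.pi * (np.arange(deg + 1) + 0.5) / (deg + 1))
cheb_coef = C.chebfit(nodes, f(nodes), deg)

# Least-squares Legendre fit of the same degree on the dense grid.
leg_coef = L.legfit(x, f(x), deg)

print("max error, Chebyshev interpolant:", np.max(np.abs(C.chebval(x, cheb_coef) - f(x))))
print("max error, Legendre LS fit:      ", np.max(np.abs(L.legval(x, leg_coef) - f(x))))
```

Roughly speaking, and only as a rule of thumb, the two bases are close cousins: Chebyshev is tied to near-minimax (max-norm) approximation and cheap node formulas, while Legendre is the natural basis for unweighted L2 approximation on [-1, 1].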
In your opinion as a math person (i.e student, teacher, researcher, etc.), are Gödel's Incompleteness Theorems of any value/importance? Are they relevant in your field of work/study? Have you encountered them in your study/work journey?
Hey fellow math enjoyers, I'm currently learning about tensor products, and a video suggested that an easy way of understanding tensor products is to connect them to V* x W*. More specifically, looking at the isomorphism between V tensor W and V* x W*. I'm just confused about what kind of structure V* x W* is. Is it just the set of all bilinear maps from V x W to F? If so, how does the direct product link the two dual spaces?
Some background: I finished a pretty rigorous linear algebra course and am currently studying group theory.
Thanks in advance.
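For what it's worth, a hedged note on the standard finite-dimensional picture (not taken from the video, so the notation is my own): the object carrying the bilinear-map structure is the tensor product of the duals rather than their Cartesian product, and for finite-dimensional V, W over F one has

[; V \otimes W \;\cong\; \mathrm{Bil}(V^{*} \times W^{*};\, F), \qquad V^{*} \otimes W^{*} \;\cong\; \mathrm{Bil}(V \times W;\, F), ;]

where Bil(· × ·; F) is the space of bilinear maps into F; the Cartesian product V* x W* on its own is just the direct sum of the two dual spaces, with no bilinear maps in sight.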
I'm going into my second year of engineering physics and thinking about specializing in quantum computing. I took Calc 1 and 2 in first year and didn't like them much, but I loved linear algebra. I'm looking into learning more pure math in the form of abstract algebra and real analysis. So yeah, I was wondering: is quantum computing the field where pure math gets applied?
As I'm finishing my second semester as a freshman, I noticed something interesting about my studying habits.
Namely, my learning productivity is substantially higher when I study for my uni classes, compared to self-studying topics outside the current curriculum. I'm convinced this is mostly due to deadlines.
Let me elaborate. For example, when I study on my own, I may explore adjacent topics, or leave some time for a problem I'm stuck on to sit in the background (so that I actually solve it myself and understand it more deeply), or spend too much time going through a particular piece of theory or problem solution, etc.
To my untrained eye, this indeed seems to come down to the absence of a limited time frame.
So, how do you go about this in your studies?
I just don't see a way to actually follow the deadlines you set yourself, since in that case the motivation is internal (passion for the subject), as opposed to "I will sacrifice comfort, give up on perfectionism and have little to no rest, but I will finish this assignment/prepare for this exam no matter what."
I am just asking a research question that came to me today, particularly in statistical inference. I am sorry if it is not well communicated, as it was a question I had in the shower and am curious to know about.
I understand that in various optimization algorithms like BFGS and Newton–Raphson, we use the 2nd derivative (the Hessian) to inform the steps we make at each update. In my context, I work with missing data.
My limited knowledge and research tell me that the Hessian describes the curvature of the function. If my Hessian matrix is taken with respect only to the observed data, how will this affect the curvature (i.e., how will it affect the direction of the updates being made to the parameters)?
Can you point me to any relevant research papers?
Thank you!
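For reference, a hedged sketch of the update being described (notation mine): if [; \ell_{\mathrm{obs}}(\theta) ;] denotes the observed-data log-likelihood, a Newton–Raphson step uses its Hessian (the observed information) directly,

[; \theta^{(k+1)} \;=\; \theta^{(k)} \;-\; \Big[ \nabla^{2} \ell_{\mathrm{obs}}\big(\theta^{(k)}\big) \Big]^{-1} \nabla \ell_{\mathrm{obs}}\big(\theta^{(k)}\big). ;]

Heuristically, missingness flattens the curvature in the directions the lost data were informative about, so those components of the step become larger and less certain. Louis (1982), "Finding the Observed Information Matrix when Using the EM Algorithm" (JRSS B), relates the observed-data and complete-data information and is a common starting point.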
This is the class before college algebra. I'm doing a combined course of college algebra and algebraic literacy. I don't know what algebraic literacy would be. I'm looking for a good crash course before I start the class.
This recurring thread is meant for users to share cool recently discovered facts, observations, proofs or concepts which might not warrant their own threads. Please be encouraging and share as many details as possible, as we would like this to be a good place for people to learn!
In all stages of his life, there'd be a different actor: kid, adolescent, middle-aged, old.
I was reading an old geometry book and it mentioned that the problem of monohedral convex pentagonal tilings was an open one. Of course, I checked Wikipedia and it said there that Michael Rao published a computer-assisted proof in 2017, but also claimed that even at the end of that year the paper had yet to be peer-reviewed fully. All other results I find date back further than December 2017, so I have no idea if the proof survived the scrutiny of the remaining parts. Does anyone have any further information?
weirdness(x) > weirdness(y) <=> [x studies things that seem more whimsical and fun than y (like knot theory) and x is more abstract than y]
Ok, so here's the context:
College calculus class (differential equations, integration, ...).
I was fully prepared, had done the previous exam, I knew my stuff.
But when I was in front of my sheet, I felt like my mind was all "messy" and I couldn't focus very well.
I had an oral exam about the theory, which went pretty well, but it bothers me that I couldn't display my actual knowledge on the written exam.
Do you know how to overcome that kind of thing?
I am thinking about the two different notions of exponentiation:
(1) The exponential map on Riemannian manifolds
(2) Exponential of an operator, e.g. exp (t d/dx) f(x) = f(x + t)
Generally on R^(n), we have
exp (v · ∇) f(x) = f(x + v) ................ (*)
I am thinking whether we can generalise this statement to general manifolds.
Translating x to x + v in R^(n), under the usual metric, is the same as applying the usual exponential map, i.e. exp_x(v) = x + v. So I was thinking: if we want to generalise (*) to general manifolds, the right-hand side might be an exponential map on the manifold at x, and (*) might look something like
exp (v · ∇) f(x) = f(exp_x(v)) ................ (**)
on a general manifold. (But I don't know how to generalise the LHS - I imagine it might be something like a covariant derivative?) If that is the case, then the two notions of exponentiation are connected to each other in a more obvious way.
Is it even possible to generalise (*) to manifolds? Or are the two notions of exponentiation related in some other ways?
I myself am more like a physicist wondering about differential geometry, so if I have said anything wrong here, or there are some other references for me to read further, please let me know.
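For reference, the usual Taylor-series justification of (*) on R^n (a sketch I'm adding, valid for analytic f):

[; e^{v \cdot \nabla} f(x) \;=\; \sum_{k \ge 0} \frac{1}{k!} (v \cdot \nabla)^{k} f(x) \;=\; \sum_{k \ge 0} \frac{1}{k!} \left. \frac{d^{k}}{dt^{k}} \right|_{t=0} f(x + t v) \;=\; f(x + v), ;]

since the middle sum is exactly the Taylor series of t ↦ f(x + tv) evaluated at t = 1. The proposed (**) is then the natural candidate once the straight lines t ↦ x + tv are replaced by geodesics t ↦ exp_x(tv).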
I'm currently a math undergrad student and I'm doing some research on knot theory during the summer. My gf (non-math major) asked me if I liked it, and the conversation ended up on why we decided to study what we're studying. The more I talked about it, the more I started to realize I couldn't really pin down what it is that motivates me to learn math. I'm sure that math being "cool" or "solving a problem is really satisfying" are not the main reasons why I am motivated to learn and do math; it's something deeper that I can't quite explain. What would your answer be?
I am a long-time r/math browser (mostly lurking /new). Browsing this subreddit is a part of my daily routine, to the point where I have read roughly 90% of all posts made here for the past four years. Being exposed to casual, yet still interesting and thought-provoking, mathematical discussion on any and all topics is one of the biggest reasons why I am able to maintain my passion for the subject. I have learned so many great pieces of math from browsing this sub, whether it be a historical anecdote, an insightful way to reframe something I know very well, or a window into the current state of branches of mathematics that I otherwise would avoid (sorry analysts).
That being said, due to reddit's recent behavior of making public libelous claims about third-party app developers, I no longer wish to use this website. I have found alternative platforms for all of my hobbies/interests except for this one. The current events are mildly political, so I will try to ask my question in the most clear-cut way possible:
What internet platforms (other than this one) exist for casual, high-level mathematical discussion among practicing mathematicians? Bonus points if laymen are not discouraged from contributing.
There has been some discussion in the post about the sub blackout, but I feel like a broader conversation would be very helpful. I would also like to preemptively say that math stackexchange isn't for me; I feel it's too formal and strict about what constitutes "worthwhile discussion".
This recurring thread will be for any questions or advice concerning careers and education in mathematics. Please feel free to post a comment below, and sort by new to see comments which may be unanswered.
Please consider including a brief introduction about your background and the context of your question.
Helpful subreddits include /r/GradSchool, /r/AskAcademia, /r/Jobs, and /r/CareerGuidance.
If you wish to discuss the math you've been thinking about, you should post in the most recent What Are You Working On? thread.
Hello, I'm an MS statistics student who recently got interested in an area of high-dimensional statistics, which aims to find low-dimensional structure in high-dimensional data. One of the big methodologies in this space is graphical models: extracting structures like graphs from data, and conducting inference on those graphs to understand noisy, complex data. My background in math is up to the level of Tao's Analysis I and II. I was wondering, what are some prerequisites I would need to tackle graph theory? And are there any books you would recommend? Thanks!
Hi everyone! I made a study group last year which was a success, and I'm doing it again this year, in part due to a friend who wishes to learn stochastic analysis. It will be on discord and hopefully we'll have weekly/fortnightly meetings on voice chat. There will be one or two selected exercises each week.
Prerequisites include measure theoretic probability and at least some familiarity with stochastic processes. Discrete-time is fine. For example you should know what a martingale and a Markov process is, at least in basic setups (SSRW and Markov chains).
Topics will include: Quick recap on probability; stochastic processes; Brownian motion; the Ito integral; Ito's lemma and SDEs; further topics, time permitting (which could include financial models, Feynman-Kac, representation theorems, Girsanov, Levy processes, filtering, stochastic control... depends on how fast we get on, and the interests of those who join).
The goal of this study group is to get the willing student to know what a stochastic integral is and how to manipulate SDEs. I think we'll do Oksendal chapters 1--5, and for stronger students, supplemented by Le Gall. Steele is great as well, pedagogically, and can be used if things in Oksendal don't quite make sense on the first read. All three books have a plethora of exercises between them.
Finally, the plan is to properly start at the beginning of July. Please leave a comment or dm me and I'll send you the invite link. See you there!
I am well aware that this is not a Psychology subreddit, but as the matter is, for me, most related to Mathematics, I thought it fitting to set it down here. I am a last-year high-school student. Until about March, I had been in a very depressive state, which saw me not doing any Mathematics at all, as well as abandoning some other activities. It relented, but ever since, for no apparent reason, there has been in my head a fear of Mathematics: a fear that all my knowledge is faulty, that I can never do anything right, and that everybody knows it better than me, even when that seems objectively absurd. Not that I am ill-instructed in Mathematics; I absolutely adore it, I love doing proofs, and I can find my way, and with great pleasure, around Analysis. My grades, although they by no means represent my knowledge, have also been excellent. At the same time, it appears that my skills in the field are waning, further straining the situation. How do I stop all this? I want to major in Mathematics, become a researcher, and learn also the history of the field, but this strange impression of inferiority precludes me from doing anything at all.
It is well known that there are possibly (infinitely?) many ways to prove a statement in mathematics, and part of the art of math is to find as many creative solutions to a problem as possible. But usually one long proof that uses more elementary tools can be condensed into a seemingly nicer proof that basically uses lemmas and definitions generalising the concepts of the first, long proof. In my opinion, these two do not differ that much from each other. Is there a way to know, or to define, when two proofs are different from each other? Since they prove the same thing, they will of course be related in some way. And going further, is there a concept of an optimal proof (the shortest proof)? I hope this question is not too dumb. I have never had a course on logic.
I remember seeing a list of False Proofs when I was taking Discrete Maths and I found it to be very interesting and also helpful.
I'm going to be a UCA (undergraduate course assistant) for an algorithms course next semester (a theoretical sophomore/junior-level algorithms course, not a data structures one), and I want to compile a bigger/cooler list of false proofs.
So, if anyone knows some common proof mistakes students make, or some cool sneaky ways to trick people in a proof, send them my way! Bonus points if they're CS/algorithms related.
Thank you so much!
As a gift (or a reference if you're coming up with proofs of your own), this is the list I had: link.
E.g. imaginary numbers/complex analysis