/r/math

This subreddit is for discussion of mathematics. All posts and comments should be directly related to mathematics, including topics related to the practice, profession and community of mathematics.

**Please read the FAQ before posting.**

**Rule 1: Stay on-topic**

All posts and comments should be directly related to mathematics, including topics related to the practice, profession and community of mathematics.

In particular, any political discussion on /r/math should be directly related to mathematics - all threads and comments should be about concrete events and how they affect mathematics. Please avoid derailing such discussions into *general political discussion*, and report any comments that do so.

**Rule 2: Questions should spark discussion**

Questions on /r/math should spark discussion. For example, if you think your question can be answered quickly, you should instead post it in the Quick Questions thread.

Requests for calculation or estimation of real-world problems and values are best suited for the Quick Questions thread, /r/askmath or /r/theydidthemath.

If you're asking for help learning/understanding something mathematical, post in the Quick Questions thread or /r/learnmath. This includes reference requests - also see our list of free online resources and recommended books.

**Rule 3: No homework problems**

Homework problems, practice problems, and similar questions should be directed to /r/learnmath, /r/homeworkhelp or /r/cheatatmathhomework. Do not ask or answer this type of question in /r/math. If you ask for help cheating, you will be banned.

**Rule 4: No career or education related questions**

If you are asking for advice on choosing classes or career prospects, please post in the stickied Career & Education Questions thread.

**Rule 5: No low-effort image/video posts**

Image/Video posts should be on-topic and should promote discussion. Memes and similar content are not permitted.

If you upload an image or video, *you must explain why it is relevant* by posting a comment providing additional information that prompts discussion.

**Rule 6: Be excellent to each other**

Do not troll, insult, antagonize, or otherwise harass. This includes not only comments directed at users of /r/math, but at any person or group of people (e.g. racism, sexism, homophobia, hate speech, etc.).

Unnecessarily combative or unkind comments may result in an immediate ban.

This subreddit is actively moderated to maintain the standards outlined above; as such, posts and comments are often removed and redirected to a more appropriate location. See more about our removal policy here.

If you post or comment something breaking the rules, the content may be removed - repeated removal violations may escalate to a ban, but not without some kind of prior warning; see here for our policy on warnings and bans. If you feel you were banned unjustly, or that the circumstances of your ban no longer apply, see our ban appeal process here.


**Recurring Threads and Resources**

*What Are You Working On?* - every Monday

*Discussing Living Proof* - every Tuesday

*Quick Questions* - every Wednesday

*Career and Education Questions* - every Thursday

*This Week I Learned* - every Friday

*A Compilation of Free, Online Math Resources*.

*Click here to chat with us on IRC!*

**Using LaTeX**

To view LaTeX on reddit, install *one* of the following:

MathJax userscript (userscripts need Greasemonkey, Tampermonkey or similar)

TeX all the things Chrome extension (configure inline math to use `[; ;]` delimiters)

`[; e^{\pi i} + 1 = 0 ;]`

Post the equation above like this:

`[; e^{\pi i}+1=0 ;]`

**Using Superscripts and Subscripts**

x*_sub_* makes x_(sub)

x*`sup`* and x^(sup) both make x^(sup)

x*_sub_`sup`* makes x_(sub)^(sup)

x*`sup`_sub_* makes x^(sup)_(sub)

**Useful Symbols**

Basic Math Symbols

≠ ± ∓ ÷ × ∙ – √ ‰ ⊗ ⊕ ⊖ ⊘ ⊙ ≤ ≥ ≦ ≧ ≨ ≩ ≺ ≻ ≼ ≽ ⊏ ⊐ ⊑ ⊒ ² ³ °

Geometry Symbols

∠ ∟ ° ≅ ~ ‖ ⟂ ⫛

Algebra Symbols

≡ ≜ ≈ ∝ ∞ ≪ ≫ ⌊⌋ ⌈⌉ ∘∏ ∐ ∑ ⋀ ⋁ ⋂ ⋃ ⨀ ⨁ ⨂ 𝖕 𝖖 𝖗 ⊲ ⊳

Set Theory Symbols

∅ ∖ ∁ ↦ ↣ ∩ ∪ ⊆ ⊂ ⊄ ⊊ ⊇ ⊃ ⊅ ⊋ ⊖ ∈ ∉ ∋ ∌ ℕ ℤ ℚ ℝ ℂ ℵ ℶ ℷ ℸ 𝓟

Logic Symbols

¬ ∨ ∧ ⊕ → ← ⇒ ⇐ ↔ ⇔ ∀ ∃ ∄ ∴ ∵ ⊤ ⊥ ⊢ ⊨ ⫤ ⊣

Calculus and Analysis Symbols

∫ ∬ ∭ ∮ ∯ ∰ ∇ ∆ δ ∂ ℱ ℒ ℓ

Greek Letters

**Other Subreddits**

**Math**

- /r/learnmath
- /r/mathbooks
- /r/cheatatmathhomework
- /r/matheducation
- /r/casualmath
- /r/puremathematics
- /r/mathpics
- /r/mathriddles
- /r/mathmemes

**Tools**

**Related fields**


1

Hi everyone,

I’ve been studying model theory in my spare time with Kirby’s *An Invitation to Model Theory*. Sometimes it feels really fast, the exercises almost seem to come out of nowhere, and it becomes awkward trying to solve them (although there’s usually a great reward when I do).

Would you recommend a book to read on the side? For context, I have a PhD in inverse scattering and a master's degree in mathematics (though more analysis-biased than logic).

Thanks!

0 Comments

2024/06/12

19:44 UTC

18

When I come across topics in mathematical physics, it's usually stuff related to theoretical mechanics or quantum mechanics. Maybe I ignore it because it's outside my area, but is there modern mathematical research related to electromagnetism? Or perhaps to other areas of physics, like nuclear physics or optics?

5 Comments

2024/06/13

00:28 UTC

222

I'm a bio major working in a bio lab, and in preparation for a new project I was looking into some statistical analysis methods for the process I plan to use; they relied on a Poisson distribution. The intuition behind the equation makes perfect sense, but I cannot figure out why e is there.

I understand that e sees so much use for being its own derivative, but why would that make it just randomly pop up here lol
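
One place the e comes from: the Poisson pmf is the limit of a Binomial(n, p = λ/n) as n grows, and the factor (1 − λ/n)^n tends to e^(−λ). A minimal numerical sketch of that convergence, with illustrative values λ = 3 and k = 2 (standard library only):

```python
import math

def binom_pmf(k, n, p):
    # P(X = k) for X ~ Binomial(n, p)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    # P(X = k) for X ~ Poisson(lam); e^{-lam} is what survives of
    # (1 - lam/n)^(n - k) in the n -> infinity limit
    return lam**k * math.exp(-lam) / math.factorial(k)

lam, k = 3.0, 2
for n in (10, 100, 10000):
    print(n, binom_pmf(k, n, lam / n))  # approaches the Poisson value
print("poisson:", poisson_pmf(k, lam))
```

As n grows, the binomial probabilities approach the Poisson one; e is not decorative, it is exactly the limit of the "no event in each tiny slice" factor.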

44 Comments

2024/06/12

21:03 UTC

5

I know this is a bit niche, but having gone through the first book, I would really like to read the second volume of Terence Tao's Analysis, which covers metric spaces, multivariate calculus, Lebesgue integration, among other things. In the past I have enjoyed reading books with people rather than by myself; we could meet up once a week to talk about the material and do the exercises.

This is mostly review for me, i.e., I have covered some (most?) of this material before. If you are learning this stuff for the first time I could offer you some guidance.

If anyone is interested, please reply :)

0 Comments

2024/06/11

23:04 UTC

22

Hi everyone,

I don’t know how to best summarise my questions into a title, so that’s the best I could do :(

Like AG people do Hartshorne or Vakil, and analytic number theory people do Davenport. Combinatorics, especially extremal combinatorics and additive number theory, also has such books, such as Yufei Zhao's GTAC or Lovász's combinatorial problems book. There's also The Cauchy-Schwarz Master Class, which I believe was once mentioned in a tweet by Thomas Bloom.

I understand that actual research problems usually appear in papers, but as there are no discrete math professors in my department, I have found it somewhat hard to find suitable problems to work on, even when I wanted to.

So I guess my question is: is it necessary to work through these books to get a general idea of the common techniques and tricks in the areas I'm interested in, such as extremal, probabilistic, and additive combinatorics, and TCS?

Many thanks!

Added context: I’m a graduate/master's student and don’t have an Olympiad / math competition background.

7 Comments

2024/06/12

18:29 UTC

80

The title of this post is purposely provocative because I want to be proven wrong and then motivated to learn the important parts of measure theory. I do honestly find it difficult, but in addition to that I've gained no insight from my cursory viewing of Wikipedia and textbooks. Also, my background is in statistics.

I'm sure that the original attempt to formalize analysis and probability within the measure theory framework was useful historically and otherwise, but that doesn't mean that reviewing the foundations is relevant for someone with more applied interests. To me, it just seems like pedantic bookkeeping. Let me give three examples:

- The Lebesgue measure allows us to integrate over pathological functions like the Dirichlet function, but those functions are not relevant to empirical reality. I've specifically failed at convincing friends that analysis at this level is interesting.
- Random variables are defined as measurable functions from a probability space (the sample space) to a measurable space (the event space). In essence, we're just allowing the sample space to be made up of things other than the actual outcomes the events will be made up of. We can tally up these things using the domain's probability measure, and the items no longer need to be real numbers. While the elements in the domain (the "weights" to be counted that comprise the probability) can now be more abstract, I still don't gain any insight from this. The elements don't correspond to any empirical model of reality anyway, so what's the problem with just making the events sets of outcomes like in the Kolmogorov axioms?
- The Radon-Nikodym theorem allows us to swap between measures if one is absolutely continuous with respect to the other (another convoluted-sounding but simple definition). All I get from this is that if one measure is roughly a subset of the other (i.e., it has measure zero at least everywhere the other one does), then we can integrate a function with respect to the larger one.

In all three cases, I didn't learn anything new. Compare this to functional analysis or even abstract algebra, where there is clear application or insight to be had. With functional analysis, we can talk about reproducing kernel Hilbert spaces and use them to develop kernel methods in ML. With abstract algebra, we can gain new insight into the relationship between structure and commutativity through things like solvable groups.

I know my understanding is partial and I'm probably wrong about a lot, so thanks in advance for any discussion! So: which measure-theoretic concepts led to a paradigm shift in the way you think, and which are relevant to real-world modeling?

109 Comments

2024/06/12

17:01 UTC

1

Hello, I am reading through a paper that gives the moment-generating function for the product of 2 normally distributed variables:

I fail to see how this is the moment-generating function; I've tried for many hours to arrive at this conclusion and can't. Any help would be much appreciated.

0 Comments

2024/06/11

20:45 UTC

4

I am very curious about Jean-Pierre Serre's mathematical views, but there are so many interviews, mostly about his Bourbaki activities, and their running times are so long that it is very difficult to find Serre's evaluation of Weil and Grothendieck. So if you know of an interview video (or article) showing Serre's assessment of Weil and Grothendieck, I would appreciate it if you could add the link to your answer.

2 Comments

2024/06/12

03:23 UTC

3

Let R be any ring and f a polynomial in R[x]. If x-a is a factor of f(x), then f(a)=0. This seems to be true.

However, if f(a) = 0, then x-a may not be a factor of f(x). Can you help me find an example showing this?

3 Comments

2024/06/12

08:53 UTC

5

This recurring thread will be for questions that might not warrant their own thread. We would like to see more conceptual-based questions posted in this thread, rather than "what is the answer to this problem?". For example, here are some kinds of questions that we'd like to see in this thread:

- Can someone explain the concept of manifolds to me?
- What are the applications of Representation Theory?
- What's a good starter book for Numerical Analysis?
- What can I do to prepare for college/grad school/getting a job?

Including a brief description of your mathematical background and the context for your question can help others give you an appropriate answer. For example, consider which subject your question is related to, or the things you already know or have tried.

11 Comments

2024/06/12

16:00 UTC

7

For a given function f(x) and some interval, I want to analyse Newton's method.

Now I want to know how to find a minimum number k of iterations of Newton's method. The bound seems to be L^k/(1-L) < ϵ. I had trouble finding the constant, so I looked around on the internet and found a result that said L = max{|f'(x)|} for x in [a,b]. There were some requirements for this to work, like f having to be differentiable (obviously). We are mostly dealing with functions in C^2, so would that mean this is a helpful result? Or are there other things I should clarify first before using this result?
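
One caveat: the bound L^k/(1-L) < ϵ is the a priori estimate from the Banach fixed-point theorem, and there L should bound the derivative of the *iteration map* g(x) = x - f(x)/f'(x) on [a,b], not |f'(x)| itself. Assuming such a contraction constant L and the first-step size |x1 - x0| are known, the smallest sufficient k can be found directly; a minimal sketch:

```python
def min_iterations(L, first_step, eps):
    """Smallest k with L**k / (1 - L) * first_step < eps,
    the a priori error bound of the Banach fixed-point theorem.
    L: contraction constant in (0, 1); first_step: |x1 - x0|."""
    assert 0 < L < 1, "the iteration must be a contraction"
    k = 1
    while L**k / (1 - L) * first_step >= eps:
        k += 1
    return k

# e.g. with L = 0.5 and |x1 - x0| = 1, 21 iterations guarantee error < 1e-6
```

Note this is a worst-case guarantee; Newton's method typically converges much faster (quadratically) once it is close to the root.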

2 Comments

2024/06/12

14:28 UTC

72

Anything other than the constructibility of certain angles and shapes, because that's already taken.

PS: I'm a third-year undergrad and I have to present a talk at my uni.

27 Comments

2024/06/12

13:07 UTC

23

So, when I started getting familiar with the Laplace transform, one of the first things I heard is that it is basically the extension of the Fourier transform into the s-plane. But over time I started thinking about it, and I noticed that this is not entirely true. The Fourier transform is rather easy to understand: you "probe" your function with a complex sinusoid, and the resulting function gives the magnitude of the individual components plotted against frequency. The LT, however, does not seem to give any information about the magnitudes of your components, only that they are part of your signal. For example, if you plug sin(4t) into the LT, you get a similar-looking plot to the Fourier transform, but instead of discernible values on the magnitude plot, you get two poles at s = ±4i. Is this because the LT is one-sided, while the Fourier transform is two-sided? I feel like I need a better kind of intuition for the LT. It seems that the LT basically assigns a decaying complex sinusoid to whatever function you plug in, and wherever you have a pole in the s-plane is a point of interest. I'm also interested in whether there is any useful information in the phase of the LT-transformed function, given that it is a complex function.
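
For concreteness, the closed form here is L{sin(4t)}(s) = 4/(s² + 16), whose poles sit on the imaginary axis at s = ±4i. A quick numerical sanity check of that formula at the sample point s = 1, truncating the integral at T = 60 where e^(−st) has long since died out (illustrative parameters only):

```python
import math

def laplace_sin4(s, T=60.0, n=200_000):
    # Trapezoidal approximation of  integral_0^T  e^{-s t} sin(4 t) dt
    h = T / n
    total = 0.0
    for i in range(n + 1):
        t = i * h
        w = 0.5 if i in (0, n) else 1.0  # trapezoid endpoint weights
        total += w * math.exp(-s * t) * math.sin(4 * t)
    return h * total

# Closed form: 4 / (s**2 + 16), with poles at s = +/- 4i
print(laplace_sin4(1.0), 4 / (1**2 + 16))
```

The two printed numbers agree to several decimal places; the "magnitude information" is still there, encoded in the residues at the poles rather than in a magnitude plot.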

7 Comments

2024/06/12

12:26 UTC

431

I always see Khan Academy recommended as a source for learning maths like calc, multivariable calc, linear algebra, etc., but in my experience it has always fallen short whenever I've used it. For example, last summer I tried learning multivariable calculus from Khan Academy, and it absolutely did not work. Khan Academy spends the vast majority of the time explaining concepts in arguably excessive detail, and when it does ask you questions, they are so simple and require so little thinking that you spend practically no time actually doing math; and when you do, it's not really engaging and doesn't make the learner really grasp the ideas. In my experience, following a lecture series or textbook alongside something like a problem book is far more effective for self-study than something like Khan Academy.

204 Comments

2024/06/12

11:05 UTC

6

I want to do some 'research' (not academically) about rating sports teams that do not necessarily share a league but have some "cross-pollination" (some of the teams play against teams in different leagues, but the majority don't); specifically high school sports like hockey. I have read about the Elo system, but I am afraid there won't be enough matches to properly calibrate all the teams.
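
For reference, the standard Elo update itself is tiny to implement; the sparse-schedule worry is real, since each rating only moves when a team actually plays (variants like Glicko address this by also tracking rating uncertainty). A minimal sketch with an illustrative K-factor of 32:

```python
def expected_score(r_a, r_b):
    # Elo model: expected score of A against B (logistic in rating difference)
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

def elo_update(r_a, r_b, score_a, k=32.0):
    """Return updated ratings after one game.
    score_a: 1.0 for an A win, 0.5 for a draw, 0.0 for a loss."""
    e_a = expected_score(r_a, r_b)
    r_a_new = r_a + k * (score_a - e_a)
    r_b_new = r_b + k * ((1.0 - score_a) - (1.0 - e_a))
    return r_a_new, r_b_new

# Two equally rated teams; A wins and gains k/2 points
print(elo_update(1500.0, 1500.0, 1.0))
```

The cross-league games are what tie the separate leagues onto one scale: without them, each league's ratings float independently and are not comparable.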

6 Comments

2024/06/12

10:43 UTC

35

Chemist here. I know this is a question that several disciplines argue about; I know mine does. I prefer to say that I "discover" new chemistry for sure, but I know some chemists (including recent Nobel winners) who will say that they invent new reactions, concepts, techniques, etc. Even when there's a lot of engineering involved in getting a system to behave the way you want it to, it still seems like the key phenomena/insights reported in a paper I want to write are something true about the universe that always was true and was just waiting to be found. If a fellow chemist tells me they "invent" or "engineer" the things their lab works on, I start to make assumptions about their mentality and how they do research (not necessarily bad, but definitely different from mine).

What's the opinion of you all? I've always found it to be "obvious" that math is discovered. There are too many examples where the facts are much richer than the definitions (and axioms) that went into them -- After all, even Cantor couldn't have anticipated all the weird properties of the set that he defined. And what about the Monster group? All that's needed conceptually to appreciate what it is is the definition of a (finite) group and the definition of a normal subgroup, and Galois had already understood these notions in the early 1800's. But it would totally blow his mind if someone could travel back in time and tell him about the completed classification of the finite simple groups.

Then again, there are some areas of math where the hard part is coming up with the appropriate definitions, and then the proofs are seemingly trivial. Stokes' theorem seems to be an example of that, and so it would appear that math is, in fact, something that needed to be invented in order to be able to make the statement rigorously. On the other hand, one could argue that it's a statement that should always have been "morally true" and was discovered in the guise of various special cases earlier on, and that it just took mathematicians a long time to find the right words to use to state it in fully general form....

I dunno, I suspect your answer will depend heavily on which branch/area/type of math you work in?

84 Comments

2024/06/12

01:10 UTC

4

I thought of this idea of "divisible-by-position numbers": a run of consecutive numbers in which each number is divisible by its position in the sequence. Obviously a start of 1 can make a run of any length, but for non-1 starts I found the first length-2 run is 3-4, the first length-3 run is 7-9, the first length-4 run is 13-16, and the first length-5 and length-6 runs are 61-65 and 61-66. As I was writing this I realized that all the starting numbers are prime! So tell me what the smallest instances of length 7, 8, 9 and so on are, and whether at any point the prime pattern stops!
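
The search is easy to brute-force. A sketch (note that `run_length` assumes a start greater than 1, since a start of 1 gives an infinite run):

```python
def run_length(start):
    # Length of the run start, start+1, ... in which the i-th term
    # (1-indexed) is divisible by i. Requires start > 1.
    i = 1
    while (start + i - 1) % i == 0:
        i += 1
    return i - 1

def smallest_start(length):
    # Smallest starting number > 1 whose run reaches the given length
    n = 2
    while run_length(n) < length:
        n += 1
    return n

print([smallest_start(k) for k in range(2, 8)])
```

A run of length k starting at n forces n ≡ 1 (mod lcm(2, ..., k)), which explains the repeats (61 works for both lengths 5 and 6) and why the small minimal starts happen to be prime: they are 1 plus a highly divisible number. There is no obvious guarantee the primality pattern continues, so the search above is worth running further.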

14 Comments

2024/06/11

21:56 UTC

4

Why is the associated set of conditional probabilities of two random variables convex if C is a convex set of joint probabilities for (u,v)? By the way, is there any set of joint probabilities that is not convex? If there is, what is the prerequisite for it to be convex? Thanks for any ideas or reading materials.

0 Comments

2024/06/11

15:58 UTC

0

I am not a mathematician, but I have read that a comfortable majority of mathematicians support the Platonist view (though I can't find the source for that poll at the moment) that mathematical objects are very real. Personally, I find such ontological promiscuity unappealing, postulating infinities of infinities of objects. I can imagine that most professional mathematicians don't care much about such questions; the answer probably won't make any difference to mathematics or to your career. Personally, I think that numbers and other mathematical objects are constructed, and that math is the art of constructing such things.

26 Comments

2024/06/11

17:12 UTC

8

I was wondering: if you took a digital image of a painting, say 'The Starry Night' by Vincent van Gogh, and computed its 2D Fourier transform, what kind of patterns or structures would emerge in the transformed image? Would the swirling brushstrokes translate into specific frequencies or symmetries? Has anyone already explored this connection between art and Fourier analysis, and if so, where can I find examples? Also, could we manipulate the Fourier transform to create new artistic interpretations of the original painting? I'm very curious about the interplay between visual art and mathematical transformations.
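
The experiment is easy to try at toy scale: strongly periodic structure (parallel brushstrokes, canvas weave) concentrates energy at a few isolated spatial frequencies, while curved swirls smear energy across many. A sketch using a synthetic 8x8 "striped" image and a naive O(N^4) transform; for a real painting one would use an FFT such as numpy.fft.fft2:

```python
import cmath

def dft2(img):
    # Naive 2D discrete Fourier transform of a small grayscale image
    # (list of rows); fine for toy sizes, O(N^4) in general.
    M, N = len(img), len(img[0])
    out = [[0j] * N for _ in range(M)]
    for u in range(M):
        for v in range(N):
            s = 0j
            for x in range(M):
                for y in range(N):
                    s += img[x][y] * cmath.exp(-2j * cmath.pi * (u * x / M + v * y / N))
            out[u][v] = s
    return out

# Vertical stripes: intensity varies only along y, with period 4
img = [[1.0 if y % 4 < 2 else 0.0 for y in range(8)] for x in range(8)]
F = dft2(img)
# Energy concentrates at (u, v) = (0, 0), (0, 2) and (0, 6); everything else is ~0
```

Inverting after zeroing or rotating selected coefficients is exactly the kind of manipulation that yields "remixed" versions of the original image.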

6 Comments

2024/06/11

10:15 UTC

5

It has been a while since I finished my bachelor's. I first had lectures on ODEs, then on vector calculus. A cousin asked me for help with his homework on exact differential equations:

M(x,y)dx + N(x,y)dy = 0

∂M/∂y = ∂N/∂x

And I noticed that this relationship looks like the symmetry of the Hessian matrix, so I am wondering whether those equations are related to potential functions.
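
That is exactly the connection: an exact equation means (M, N) = ∇F for a potential F, and ∂M/∂y = ∂N/∂x is the symmetry of the mixed second partials of F, i.e. of the off-diagonal Hessian entries. A small numerical sketch, using the hypothetical potential F(x, y) = x²y (so M = 2xy, N = x²) and a finite-difference test of the condition:

```python
def is_exact(M, N, x, y, h=1e-5, tol=1e-4):
    # Central-difference test of dM/dy == dN/dx at one point (x, y)
    dM_dy = (M(x, y + h) - M(x, y - h)) / (2 * h)
    dN_dx = (N(x + h, y) - N(x - h, y)) / (2 * h)
    return abs(dM_dy - dN_dx) < tol

# Exact: (M, N) = grad F for F(x, y) = x**2 * y
print(is_exact(lambda x, y: 2 * x * y, lambda x, y: x * x, 1.3, 0.7))

# Not exact: M = y, N = -x gives dM/dy = 1 but dN/dx = -1
print(is_exact(lambda x, y: y, lambda x, y: -x, 1.3, 0.7))
```

Solving the exact equation amounts to recovering F and writing the solution curves as F(x, y) = C, which is precisely the potential-function picture.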

2 Comments

2024/06/11

16:56 UTC

31

Recently, I got curious about the topic of *pattern recognition in mathematics*. Everyone who has studied mathematics knows that it is important. Out of curiosity I searched online about *pattern recognition and maths*, and to my surprise the majority of results (some argued otherwise, though with weak arguments; please give me a stronger one) boldly declared:

mathematics is pattern recognition and vice versa.

**Definition of Pattern Recognition**: Given an entirely new problem, solving or reframing a part (which will lead to the complete solution) or the whole using one's existing knowledge base of solved problems.

I even found an old book which claims as the title of the book suggests, Mathematics as a science of patterns.

I am really surprised. This may also mean the following is true:

*If you take an immortal dude with excellent memory and pattern recognition, lock him up in a room, give him all the mathematical problems that have ever existed, and ask him to solve them on the condition that he never formalises any of the methods:*

- Is he actually learning mathematics, relying only on his excellent memory and pattern recognition?
- Is mathematics all about memorising patterns and applying them quickly?
- If we can teach a machine to recognise complex patterns, won't it just destroy all humans at mathematics?

25 Comments

2024/06/11

16:18 UTC

2

Hi, I am taking a course on partial differential equations and I'm not a big fan of the lectures. Do you guys know of any good online lectures covering this book?

0 Comments

2024/06/11

14:17 UTC

153

For me it was an explanation from my first-year junior high school math teacher. I didn't really like math then, but I really liked her lessons because she could prove EVERYTHING. We were talking about positive and negative numbers and why a negative multiplied by a negative equals a positive number. The ancients could figure out why positive times positive equals positive and why positive times negative equals negative, but they couldn't figure out the above. Here's her explanation:

First way: do the addition in the parentheses: -6×[2+(-2)] = -6×0 = 0. But if we use the distributive property: -6×[2+(-2)] = -6×2 + (-6)×(-2) = -12 + (-6)×(-2). We know from the first way that -12 + (-6)×(-2) equals 0, so (-6)×(-2) = 0 + 12 = +12. That's why, she told us, negative multiplied by negative equals positive.

133 Comments

2024/06/11

13:23 UTC

31

Any links to recent papers, conferences and scholars making progress in this area?

6 Comments

2024/06/11

12:35 UTC

75

Edit: Guys, I think what I meant wasn't math concepts but concepts that are true of math at all levels.

The only way I can explain it is this: imagine a person who worked in different places at different times and did many various things, and no matter what those things were, most come down to simpler concepts that are ultimately true of them all.

I'm sorry if my post came off as low-effort. For those who think I did not read the rules: I did, just before posting, but I thought my phrasing was understandable and clear, although now I wouldn't say so. English isn't my first language, so I hope you all understand that it wasn't my intention to make this question sound as vague and raw as possible.

131 Comments

2024/06/11

11:57 UTC

4

I just finished my master's degree, studying the stability of some models of the Schrödinger equation in 1 dimension.

Now I would like to generalize these results to R^2 or R^3, for example.

But most of the articles I see on Schrödinger equations use T or R as the space for the x variable, and a natural question for me is: why not work in R^n or in bounded domains?

It's especially strange when I see a lot of articles on the Navier-Stokes/Euler equations, most of them set in R^3.

3 Comments

2024/06/11

08:30 UTC