
Welcome to /r/ComputerScience!

We're glad you're here.

This subreddit is dedicated to Computer Science topics such as algorithms, computation, theory of languages, theory of programming, some software engineering, AI, cryptography, information theory, computer architecture, etc.


  1. Content must be on-topic
  2. Be civil
  3. No career, major or courses advice
  4. No advertising
  5. No joke submissions
  6. No laptop/desktop purchase advice
  7. No tech/programming support
  8. No homework, exams, projects etc.
  9. No asking for ideas

For more detailed descriptions of these rules, please visit the rules page.

Related subreddits


  • Header image is found here.
  • Subreddit logo is under an open source license from lessonhacker.com, found here



363,926 Subscribers


Question About Registers

Hello everyone. There is a misunderstanding I have somewhere that I would like to clear up.

I know that CPU registers are very fast and small and we can work with registers by writing assembly.

Here is where my misunderstanding lies: when I was taking my Architecture course, we had assignments where we had to write simple programs in assembly, like, say, a simple sort or something.

If a program is running on the machine already, say I have a chat client running in the background on the machine, are the registers not in use running that program? How is it that I can write a sorting program in assembly moving values around to registers if the registers are already working with other data? Is there somehow no overlap?

What am I missing here?

If I write a program in assembly and MOV some value into a register like eax, how is there no other information already in that register, such that I'd be overwriting or affecting other programs that are running?
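The usual resolution is the OS context switch: on every switch, the kernel saves the outgoing process's registers to memory and restores the incoming one's, so each program sees the registers as if it owned them. A toy Python sketch of that bookkeeping (all names and values are made up for illustration):

```python
# Toy scheduler: each "process" owns a saved copy of the register file in
# memory, and a context switch swaps register state through those copies.
cpu = {"eax": 0, "ebx": 0}                 # the one physical register file

def context_switch(old_state, new_state):
    old_state.update(cpu)                  # save outgoing registers to memory
    cpu.update(new_state)                  # restore incoming registers

chat_client = {"eax": 111, "ebx": 222}     # saved register states live in RAM
my_sort     = {"eax": 0,   "ebx": 0}

context_switch(my_sort, chat_client)       # the chat client runs...
cpu["eax"] = 999                           # ...and clobbers eax freely
context_switch(chat_client, my_sort)       # now the sort runs with its own eax
```

Real kernels do this in assembly on every timer interrupt, which is why your MOVs never collide with the chat client's: your code only ever runs between a restore of your state and the next save.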

20:03 UTC


How computers measure time

Can someone explain this to me? I've been told there is a chip containing a material that vibrates at a certain frequency when a certain current is passed through it, and when you pass a known current, you just have to count the oscillations to measure time. But I've been told that's an imprecise method and that there are other, more precise methods in use, and no one has been able to explain to me how those work. Please help if you know about this.
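The vibrating material is a quartz crystal: a clock circuit counts its oscillations and divides by the known frequency. More precise systems refine the same counting idea, e.g. temperature-compensated oscillators or periodic synchronization to atomic clocks over NTP. A minimal sketch of the arithmetic, using the classic watch-crystal frequency:

```python
# A watch crystal vibrates at 32,768 Hz (2**15), chosen so a 15-bit binary
# counter overflows exactly once per second.
CRYSTAL_HZ = 32_768

def ticks_to_seconds(tick_count: int) -> float:
    # Elapsed time is just oscillations counted divided by the frequency.
    return tick_count / CRYSTAL_HZ

ticks_to_seconds(32_768)   # 1.0 — one second per full counter cycle
```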

19:41 UTC


Learning a new skill

Hey guys,

I wanted to ask what a good programming language would be to learn for first-timers. I had one C++ course in college that went over some of the basics of writing code, and I liked solving problems that way. I was hoping to develop a new skill for fun and maybe make a career change in the future.

18:14 UTC


Recommendations of Hackathons, GameJams, Tech Conferences, and similar events?

I'm interested in many fields of CS, so I'll look up any event you think is interesting. Currently, though, I'm most interested in 3D graphics, data visualization, game dev, AR/VR, and AI. I also have some experience in web dev, but less interest (still, I'll look those events up at least once).

Oh and an extra question: Which subreddit(s) would you recommend for being involved in and informed about these events?

16:39 UTC


Pumping lemma Question

I have 2 languages; according to our teacher, the first one is regular and the second one is non-regular. The languages are:

  1. {w ∈ Σ* | w has an equal number of occurrences of "ab" and "ba" as substrings}

  2. {w ∈ Σ* | w has an equal number of a's and b's}

By using the pumping lemma, I can prove that there will be an imbalance of substrings as n in y^n increases, which would prove that both languages are non-regular. Can someone explain to me how my prof is correct?
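For intuition on why the first language is regular (assuming Σ = {a, b}): the counts of "ab" and "ba" can never differ by more than one, and they are equal exactly when the string is empty or its first and last letters match, a property a DFA can check with finitely many states. A quick brute-force check of that characterization:

```python
from itertools import product

def count(w: str, sub: str) -> int:
    # count (possibly overlapping) occurrences of sub in w
    return sum(w[i:i + len(sub)] == sub for i in range(len(w) - len(sub) + 1))

def in_language(w: str) -> bool:
    return count(w, "ab") == count(w, "ba")

def dfa_checkable(w: str) -> bool:
    # Finite-state property: empty, or first letter equals last letter
    return w == "" or w[0] == w[-1]

# The two predicates agree on every string over {a, b} up to length 8:
ok = all(in_language("".join(t)) == dfa_checkable("".join(t))
         for n in range(9) for t in product("ab", repeat=n))
```

So any pumping-lemma "proof" against language 1 must pick a witness string that actually leaves the language after pumping, and for this language no such witness exists.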

20:57 UTC


/r/ComputerScience will be going dark starting June 12th in protest against Reddit's API changes which will kill 3rd party apps & tools

Update (June 16th, 2023):

This subreddit remains closed to new submissions and comments as part of the ongoing protest over Reddit policy changes. However, we've chosen to switch the subreddit to read-only, so that existing user contributions will not be censored.

What's going on?

A recent Reddit policy change threatens to kill many beloved third-party mobile apps, making a great many quality-of-life features not seen in the official mobile app permanently inaccessible to users.

On May 31, 2023, Reddit announced they were raising the price to make calls to their API from being free to a level that will kill every third party app on Reddit, from Apollo to Reddit is Fun to Narwhal to BaconReader to Sync.

Even if you're not a mobile user and don't use any of those apps, this is a step toward killing other ways of customizing Reddit, such as Reddit Enhancement Suite or the use of the old.reddit.com desktop interface.

This isn't only a problem on the user level: many subreddit moderators depend on tools only available outside the official app to keep their communities on-topic and spam-free.

What's the plan?

On June 12th, many subreddits will be going dark to protest this policy. Some will return after 48 hours; others will go away permanently unless the issue is adequately addressed, since many moderators aren't able to keep doing the work they do with only the poor tools available through the official app. This isn't something any of us does lightly: we do what we do because we love Reddit, and we truly believe this change will make it impossible to keep doing what we love.

The two-day blackout isn't the goal, and it isn't the end. Should things reach the 14th with no sign of Reddit choosing to fix what they've broken, we'll use the community and buzz we've built between then and now as a tool for further action.

What can you do?

  1. Complain. Message the mods of r/reddit.com, who are the admins of the site; message /u/reddit; submit a support request; comment in relevant threads on r/reddit, such as this one; leave a negative review on their official iOS or Android app; and sign your username in support to this post.

  2. Spread the word. Rabble-rouse on related subreddits. Meme it up, make it spicy. Bitch about it to your cat. Suggest anyone you know who moderates a subreddit join us at our sister sub at r/ModCoord - but please don't pester mods you don't know by simply spamming their modmail.

  3. Boycott and spread the word...to Reddit's competition! Stay off Reddit entirely on June 12th through the 13th; instead, take to your favorite non-Reddit platform of choice and make some noise in support!

  4. Don't be a jerk. As upsetting as this may be, threats, profanity, and vandalism will be worse than useless in getting people on our side. Please make every effort to be as restrained, polite, reasonable, and law-abiding as possible.

20:56 UTC


Best Practices using LaTeX.

I have just finished my first thesis, seventy pages in LaTeX, and would like to share my tips with you. I would also like to hear your tips for using TeX in the comments below.

I started with TeX locally on my computer with TeXShop. Unfortunately, my computer broke down while I was working, and I could only restore parts of my work from my backups. I also sometimes wanted to work on other computers. For me personally, Overleaf (https://www.overleaf.com/) has turned out to be a very good companion as an editor in the browser. Overleaf also gave very good formatting tips during the work with regard to creating tables, missing BibTeX parameters, and much more.

I used Zotero (https://www.zotero.org/) as library management software. In particular, I quickly lost track of what was relevant to my work and where. With the browser add-on, I was able to save the sources relatively quickly and create my bibliography in BiBTeX format.

Make backups of your interim work regularly. Even if Overleaf saves your results online, you should create backups that contain the entire source code. The Pro version even offers Git integration.

To create notes in my work as I went, I can strongly recommend the todonotes package (to be imported with \usepackage{todonotes}). Using the command \todo{your note} you can then mark the corresponding paragraphs at the right place.

Since LaTeX sometimes places graphics elsewhere, I can also recommend the package placeins. With its help you can limit float placement to certain sections. The package is imported using \usepackage[section]{placeins}; the section option enables the per-section limiting.

My thesis also has programming code in the appendix. To format it as code, I highly recommend the listings package (to be imported using \usepackage{listings}).
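Pulling the packages above together, a minimal preamble might look like this (a sketch only; adjust options to your document class):

```latex
\documentclass{article}
\usepackage{todonotes}          % \todo{...} margin notes while drafting
\usepackage[section]{placeins}  % keep floats inside their own section
\usepackage{listings}           % typeset source code, e.g. in the appendix

\begin{document}
Some paragraph.\todo{check this citation}

\begin{lstlisting}[language=Python]
print("hello")
\end{lstlisting}
\end{document}
```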

Those were my biggest helpers; I hope they are of some use to those of you currently working on a TeX project. Otherwise, I am very curious about your tips.

Peace out.

15:01 UTC


Any CS books which present their subject chronologically?

I recently read Calculus Reordered and Real Analysis: A Radical Approach, both by David Bressoud, and I really loved the idea of presenting the ideas of a subject as they developed through history, giving the historical background of each part of the theory.

I am curious if there are any other books which are like that (books that present their material chronologically and try to explain the rationale behind each advancement in the theory, not cs history books)

I would also appreciate it if you could suggest a book in adjacent fields (math, etc)

14:19 UTC


I don't understand the Halting Problem

The key to the solution, from what I've got so far, is imagining a machine that can detect whether or not another machine halts. We'll call that machine H, that takes as input a machine Q

Then, with some simple logic, we can make a machine that halts when H detects Q doesn't halt, and that doesn't halt when it detects Q halts. That'd be Hn

So, with this machine Hn, the paradox would emerge by doing Hn(Hn), since now, if it were to halt, it shouldn't halt, but if it doesn't halt, it should halt. Paradox.

But, one key thing I don't get: Doesn't the internal Hn also need an input? And wouldn't all of the subsequent machines need an input as well, until we reach a static program?

So instead of being Hn(Hn), shouldn't the paradox only arise if you do Hn(Hn(Hn(...))) forever? And at that point, the paradox isn't due to the algorithm but rather because it's a supertask.

The same could be said if you build a machine that takes another machine (one that halts) as input and outputs 0 when the original outputs 1, and outputs 1 when the original outputs 0; let's call that one N.

N(N) should also be a paradox, no? After all, if N outputs 0, it should output 1, but if it were to output 1, it should output 0!

If you don't provide an internal input for the machine you are using as input, of course you can't analyze how it's going to run.
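One way to see why there is no regress: a machine's input is a *description* (a finite string), not a running machine, so Hn(Hn) means handing Hn its own source code once, as data. A toy Python model of this (the oracle here is stubbed over two made-up programs; a real, general halts cannot exist, which is exactly the theorem):

```python
# Toy model: a "program" is just its description (a string key), so feeding
# Hn its own description is a single finite call, not an infinite nesting.
BEHAVIOR = {"loop_forever": False, "return_zero": True}   # made-up programs

def halts(prog: str, inp: str) -> bool:
    # Hypothetical halting oracle, stubbed for the two toy programs above.
    return BEHAVIOR[prog]

def Hn(prog: str) -> str:
    # Flip the oracle's verdict. Note the argument is data, not a machine.
    if halts(prog, prog):
        return "Hn loops forever"   # stand-in for entering an infinite loop
    return "Hn halts"
```

The paradox then comes from asking what halts("Hn", "Hn") should return once "Hn" itself is added to the table: either answer contradicts Hn's behavior, so no total halts can exist.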

21:25 UTC


How to Use a Collaborative Approach to Problem-Solving

Hello there!

This is an article I posted on the original Algorithmically Speaking blog, that also got published on freeCodeCamp.

Here, I share my vision on some aspects of technical interviews for programming jobs based on my own experiences when interviewing for a position at Volvo Cars and later as an interviewer myself.

Hope you enjoy it!

Disclaimer: all the code examples are written in Python but I don't think you will need a deep understanding of the language to get the most out of this article.

Read it here: How to Use a Collaborative Approach to Problem-Solving

Let me know your thoughts in the comments.

13:27 UTC


Practical computation with non-von Neumann machines?

Right now we do nearly 100% of our computation with von Neumann machines. Occasionally we use them to emulate other forms of computation like neural networks.

I'm really interested in unconventional forms of computation. It looks like there's an infinite zoo of computational complexity out there - broad categories like cellular automata, neural networks, register machines, etc with endless variations within.

But the tricky part is doing practical computation. Most of these systems are highly chaotic and impossible to program by hand. Neural networks are widely "programmed" through optimization, and this can be extended somewhat to cellular automata.

Are there any examples of techniques to program other nonstandard computational systems? Bonus points if they're physical hardware instead of emulations.
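On the cellular-automaton side, even the simplest family is surprisingly powerful: Rule 110, an elementary (one-dimensional, two-state) cellular automaton, is known to be Turing-complete. A minimal sketch of one update step, to show how little machinery the model needs:

```python
def rule110_step(cells):
    # One synchronous update of elementary CA Rule 110, zero boundaries.
    # Each cell's next state is looked up from the 3-cell neighborhood
    # pattern, using the bits of the number 110 as the truth table.
    n = len(cells)
    out = []
    for i in range(n):
        left = cells[i - 1] if i > 0 else 0
        center = cells[i]
        right = cells[i + 1] if i < n - 1 else 0
        pattern = (left << 2) | (center << 1) | right   # 0..7
        out.append((110 >> pattern) & 1)
    return out
```

The hard part the post asks about is real: nobody hand-writes Rule 110 "programs"; the Turing-completeness construction encodes computation in colliding gliders, which is closer to compiling into the dynamics than programming in the usual sense.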

19:22 UTC


Good papers for high school students?

This summer I am hosting an internship for 5 high school students (recent graduates through rising juniors) who have taken up to Computer Science AP, and may have taken cybersecurity, participated in extracurricular events, and so on.

I'm already planning on introducing them to the material from MIT's "The Missing Semester of Your Computer Science Education".

I'm looking for suggestions on good 'toe-dipping' computer science papers, to expose them early to the idea that "this stuff is approachable if you work at it".

I've been considering stuff from Knuth's books like "Selected Papers on Fun and Games". Any specific suggestions from this audience?

14:47 UTC


Judy Array, are they still a viable choice?

So I've been reading the memcached initial commit and noticed their in-memory storage is a Judy array. I can't really find a lot of info on this data structure besides the developer's blog posts from almost 20 years ago. Wikipedia says it's needlessly complicated? Skip lists and other tree structures seem to have similar use cases as well.

Is there any reason why I haven't seen this data structure more in modern code? Maybe I'm just ignorant?

If anybody is an old school open source dev I'd love to know what the Judy array was succeeded by.

13:04 UTC


Developer communities London UK


TL;DR: Does anyone in this community have any recommendations for groups or communities that meet in person in London to work on projects together, learn together, and network?

Background: I’ve recently made the decision to try to pivot my career from a somewhat related role towards becoming a software dev. I’m doing a lot of personal learning to plug any knowledge/ coding gaps and I’m creating a portfolio of projects for interviews etc. However, one thing I feel I’m missing is the opportunity to work with other like minded individuals, learn together, network, meet potential employers, understand what it is that I don’t know etc. - and so I’m looking to find a community that meets in person fairly regularly and is set up to help with the above.

Thanks in advance for any recommendations!

1 Comment
10:27 UTC


Theoretical computational model stronger than a Turing machine

Suppose two things: time is infinite in the future, and we can make a time machine. Now imagine a machine that takes any question as input and runs on it just like a Turing machine. If the machine halts at any point in the future, it has the ability to go back in time to exactly one second after you pushed the calculate button. If you see a machine that came back in time after a second, then the program halts. If you don't see a machine, then it doesn't halt.

What do you think?

05:30 UTC


If we describe the size of combinatorics problem as 10^N, where is the solvability boundary in the case of CPU / GPU / Quantum Computing? (say, we have 1month)

In 2015, I saw that Google's quantum computer was claimed to be 100M times faster than an earlier GPU supercomputer. Maybe it is hard to define a "solvability boundary" without clearly defining the problem-solving period.
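With a one-month budget you can at least bound N for brute-force enumeration: multiply an assumed checking rate by the seconds in a month and take log10. The 10^9 checks/second figure below is an illustrative assumption, not a measurement:

```python
import math

ops_per_sec = 1e9                     # assumed rate for one conventional core
seconds_per_month = 30 * 24 * 3600    # 2,592,000 s
total_ops = ops_per_sec * seconds_per_month

# Largest N such that all 10**N candidate checks fit in the budget:
max_N = math.log10(total_ops)         # about 15.4
```

This also shows why the boundary barely moves: a 100-million-fold speedup (the figure quoted for the quantum result) adds only log10(10^8) = 8 to N, so exponential search spaces stay out of reach regardless of hardware.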

23:15 UTC


AR/VR development

Hey! I'm trying to get into AR/VR development (software). Could someone share some resources on how I can get into this field? Tutorials, tips, anything works.

20:22 UTC


Can blockchain replace cloud?

Hey, I am a CS student and have been pondering the newer technologies emerging. I've been very interested in cloud and am pursuing an Architect certification from Azure, but all the hype around blockchain has me concerned that it will replace cloud. I'm new to all this; as I said, I'm just a student right now, and I'm eager to know if this scenario could ever happen, because right now I still have time to switch over to blockchain (I like CS as a whole, not just cloud). I'm really looking for some guidance, so I just wanted to know your opinions. Thank you!

16:23 UTC


What kind of text-file format is best for specifying trees?


I'm playing with the thought of writing a program for rendering trees.

In the first instances, they are just rendered in the command line window as a regular tree.

I want to make a text file that can be parsed as a tree, with the bare minimum of fuss, so that defining a tree is as easy as possible but with as little ambiguity as possible. It should also be easy to edit the definition, add and delete nodes, and so on.

I have thought of using markdown headings for specifying levels, but think even that is a little too much.

As I see it I have two alternatives that are viable:

  • tab indentation

    root
        level1
            level2
        level1
            level2
    and so on

  • markdown headers, much like the first one.

I think making the user write the tree the way it would be laid out in an array is too much.

I wonder if I am overlooking some other option for defining a tree in a text file that is just as easy. (I have considered numbering, but think renumbering would be too big a chore.)


I want a file format as simple as possible for being able to parse a text file as an inorder binary tree, with as little markup as humanly possible, and yet leave as little room as possible for ambiguities.

The motivation behind this is to lower the threshold for drawing a graph down to nothing. The end goal is to output pic scripts you can edit further, and then have pic render the diagrams as SVG or whatever format is available.

So, thank you for any suggestions!
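For the tab-indentation option, the parser really is tiny, which is an argument in its favor. A sketch in Python (names are hypothetical; one node label per line, depth given by the number of leading tabs):

```python
def parse_tree(text: str):
    """Parse tab-indented lines into nested (label, children) tuples."""
    root = ("", [])                # synthetic root holding top-level nodes
    stack = [(-1, root)]           # (depth, node) path from root to current
    for line in text.splitlines():
        if not line.strip():
            continue               # skip blank lines
        depth = len(line) - len(line.lstrip("\t"))
        node = (line.strip(), [])
        while stack and stack[-1][0] >= depth:
            stack.pop()            # climb back up to this node's parent
        stack[-1][1][1].append(node)
        stack.append((depth, node))
    return root[1]                 # list of top-level trees
```

An example input and the structure it yields: "root" with children "a" (which has child "b") and "c" comes out as [("root", [("a", [("b", [])]), ("c", [])])]. Ambiguity is limited to one rule: a line's parent is the nearest preceding line with smaller indentation.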

13:53 UTC


What are the best reference or dictionary sites with computer science definitions?

What are the best reference or dictionary sites with computer science definitions, like Wikipedia or FOLDOC?

13:28 UTC


What's the difference between Model Formulation and methodology in a computer science paper

Struggling to find the difference; it feels like methodology isn't really applicable to the field of CS. Am I wrong?

12:24 UTC


Data Retraining in SAS Viya

Hi, everyone. I am new to this area and have a question about data retraining.

When I try to use SAS Viya to manage new models (such as credit scoring), I am confused about how to select retraining datasets. For example, when evaluating performance on the test datasets (Q1, Q2, Q3, Q4, Q5) by looking at the ROC graph, Q2 and Q5 do best, and certain variables show great differences across the quarters, which requires retraining our model on other data. How do we choose the data to retrain on? What are the standards or methods for selecting it? It would be great if anyone could help me with this question, thank you! (This is the link for the tutorial I am looking at: https://www.youtube.com/watch?v=IGW77r-KQDc . In their example, they use the most recent data, Q5's, to retrain, as the model is degrading over time. But what if the degradation doesn't follow a time order?)

1 Comment
11:33 UTC


What does the market for file compression look like these days ?

I don’t know if this is the right sub, but I was wondering if the market for compression is booming right now. I’m phrasing it poorly but how much do you think a new and improved file compression system might be worth ?

22:19 UTC


Artificial Image Research - be part of something big!

Hi r/computerscience,

I'm writing a research paper on creating artificial images and I need your help!

I'm researching the effectiveness of Human Eye Perceptual Evaluation (HYPE) which is basically using human vision as a metric of evaluating how good a piece of AI generated art really is.

In particular, I need the help of redditors in answering this form to bump up the sample size before I can get permission to conduct official research: https://forms.office.com/e/YhBxsA6Lej

There are multiple questions but they are quick fire multiple choice (i.e. real or fake) so thank you for your time!

18:09 UTC


How do computers or any digital devices actually work?

Like, I understand that logic gates are set up to perform corresponding operations, and input and output are given as voltage (high/low). So how do these logical operations make a Word doc or web browsing work?
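The bridge between the two levels is composition: gates build adders, adders build arithmetic units, and software is stacked on top of that. A toy sketch building familiar operations out of a single NAND gate, ending in a half adder that adds two bits (Python here purely as illustration; in hardware these are transistor circuits):

```python
# Everything below is derived from one primitive gate.
def NAND(a, b): return 1 - (a & b)

def NOT(a):     return NAND(a, a)
def AND(a, b):  return NOT(NAND(a, b))
def OR(a, b):   return NAND(NOT(a), NOT(b))
def XOR(a, b):  return OR(AND(a, NOT(b)), AND(NOT(a), b))

def half_adder(a, b):
    # Adds two bits: XOR gives the sum bit, AND gives the carry bit.
    return XOR(a, b), AND(a, b)
```

Chain adders and you get integer arithmetic; add memory and a clock and you get a processor stepping through instructions; layers of software (firmware, OS, applications) turn those instructions into a word processor or a browser.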

07:29 UTC


Introduction to Quantum Computing with IBM Quantum

I found this program today. Quantum computing can potentially change our world, applications include revolutionizing cybersecurity. This appears to be a virtual program and you can apply for scholarships. Otherwise it could cost up to $1.2K for both semesters.

I signed up, looking forward not only to learning a useful skill but also to networking with potentially future millionaires.


01:59 UTC


What book should I read first on computer science?

I need to find something basic and understandable.

23:47 UTC


Interview with Andrew Ng and Chelsea Finn: AI & Robotics

21:03 UTC



I am struggling with calculus and was wondering how much we will be using calculus in Computer Science.

17:00 UTC
