
Computer Science Theory and Application. We share and discuss any content that computer scientists find interesting. People from all walks of life welcome, including hackers, hobbyists, professionals, and academics.

Welcome Computer Science researchers, students, professionals, and enthusiasts!


Self-posts and Q&A threads are welcome, but we prefer high quality posts focused directly on graduate level CS material. We discourage most posts about introductory material, how to study CS, or about careers. For those topics, please consider one of the subreddits in the sidebar instead.

Want to study CS or learn programming?

Read the original free Structure and Interpretation of Computer Programs (or see the online conversion of SICP)

Related subreddits

Other topics are likely better suited for:

Other online communities:

If you are new to Computer Science please read our FAQ before posting. A list of book recommendations from our community for various topics can be found here.


2,908,167 Subscribers


Can an AI perform calculations using a logic machine after understanding the meaning?

Instead of thinking and responding inside a neural black box, could an AI understand the meaning of a problem and then perform the calculations using a logic machine?

17:03 UTC


Algorithm to find a solution that satisfies several conditions at the same time

I need to create an algorithm to crop an image to dimensions that satisfy a couple of different conditions at the same time. For example, the eye level has to be at 60-70% of the image's height and the head should occupy 40-50% of the image height, while keeping a 1:1 ratio and keeping the resolution above 1000px.

(The eye level and face dimensions are already calculated)

What kind of algorithm could find a solution efficiently, if one exists?

I tried brute-forcing (60% eye level + 40% face height, 61% eye level + 41% face height, ...), but as you can imagine it's very slow. I need some help.
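Brute force isn't needed here: with a square crop of side s and top offset t, every condition listed is a linear inequality in s and t, so the feasible set is just an intersection of intervals you can compute directly. A minimal sketch, assuming the eye level and head height are given in pixels measured from the top of the source image:

```python
def find_square_crop(eye_y, head_h, img_w, img_h, min_res=1000):
    """Return (side, top) of a feasible square crop, or None.

    Constraints, all linear in the crop side s and top offset t:
      0.60*s <= eye_y - t <= 0.70*s   # eye level at 60-70% of crop height
      0.40*s <= head_h    <= 0.50*s   # head occupies 40-50% of crop height
      s >= min_res, and the crop fits inside the image.
    """
    # The head constraint pins s to an interval: head_h/0.5 <= s <= head_h/0.4.
    # Every other constraint only shrinks that interval further.
    s_lo = max(head_h / 0.50, min_res)
    s_hi = min(head_h / 0.40, img_w, img_h,
               eye_y / 0.60,               # keeps t >= 0 achievable
               (img_h - eye_y) / 0.30)     # keeps t + s <= img_h achievable
    if s_lo > s_hi:
        return None                        # the conditions are contradictory
    s = s_lo                               # any s in [s_lo, s_hi] works
    t = max(eye_y - 0.70 * s, 0.0)         # smallest admissible top offset
    return s, t

print(find_square_crop(eye_y=800, head_h=450, img_w=2000, img_h=2000))
# -> (1000, 100.0): eye at 70% of the crop, head at 45%
```

Instead of stepping through percentage combinations, this solves the whole system at once; if you later add non-linear conditions, the same idea generalizes to a 1-D search over s only, since t is determined by an interval for each s.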

05:13 UTC


Operating Systems Question

This was a question in my university's Operating Systems course, and there was a fair amount of disagreement about the correct answer. I am curious what you all think the correct answer is.
Which of the following features is not necessarily a "standard" feature of most of today's OSs?

  1. Being heavily involved when an I/O device needs to communicate with RAM.
  2. Managing all hardware resources.
  3. Monitoring malware.
  4. Providing a uniform user interface.
20:28 UTC


Advice for a newbie (APIs and animation in apps, mainly)

It has been 15 days since I started learning app dev and there is just so much in this field that I already feel lost. When I try to go beyond copying the tutorials and implement a few ideas by myself, I run into a pile of problems that I just can't get my head around. So if you know of some courses, websites, or YT channels that will help me get the hang of everything, do suggest them. Currently I am interested in slick, fluid animations in my apps and in integrating APIs, and these are the concepts that are so scattered across the internet that they take time to find. If you have resources that can help, do share them.

08:41 UTC


Humanity vs. Highlighters: Winning the Student Sleep Battle

07:00 UTC


Need fullstack project ideas.

Need some ideas for a new fullstack project. If you have any, let me know.

15:54 UTC


Is there any field in computer science that changed the way you think and enhanced your understanding of how the universe works?

15:41 UTC


Cross-Validation Explained

Hi there,

I've created a video here where I explain how cross-validation works and why it is useful.

I hope it may be of use to some of you out there. Feedback is more than welcome! :)

14:08 UTC


Permutation Functions Analysis

There is the problem of generating a random permutation of [1..N]; for simplicity, N is a power of 2. One way is to use a permutation function F_key(x) that depends on a key and to generate the permutation either in recurrent form {F_key(seed), F_key^2(seed), F_key^3(seed), ...} (as with LCGs, LFSRs and such) or in the form {F_key(0), F_key(1), F_key(2), ...} using various cryptographic primitives. This avoids storing the whole permutation in memory on the one hand, and gives a random key to pass around as an identifier on the other.

If we fix a specific permutation function F, then iterating over all possible keys, with every F_key generating a single permutation, gives us a subgroup of the group S_n of all permutations. I'd like to discuss the landscape, with its pros and cons. Is there theoretical analysis across the different families, such as polynomial-arithmetic-based functions, substitutions, xorshifts, and others, that can answer the following questions:

  • How big is the generated subgroup? Is there a family capable of generating A_n, or even the whole of S_n, even if it requires arbitrarily big keys?
  • Permutation functions that generate a different permutation for every key are called ideal. Are there any ideal permutation functions capable of producing big enough subgroups, or even S_n itself, i.e. "ideal ideal" functions?
  • What about the elasticity of a permutation function in terms of size? The usual approach for a random permutation is to use one of the standard block ciphers, but those come in fixed sizes, such as 128 bits. How much do the properties above suffer if we consider variable-size permutation functions, at the expense of not being cryptographically secure?

So, I want to discuss whether there are definite answers or whether it's still ongoing research, hoping to find a family that is easy to adjust to a specific length N and that generates a big enough subgroup of S_n, with "most" keys producing different permutations. I think that could be useful for others as well.
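On the variable-size point: the standard construction is a small Feistel network plus cycle-walking. A balanced Feistel network is a permutation of [0, 2^b) for any even b, and cycle-walking restricts it to any domain of size n <= 2^b. A minimal Python sketch (toy round function, not cryptographically secure, and with a fixed round count the keyed family is far from all of S_n):

```python
import hashlib

def feistel_perm(x, key, n_bits, rounds=4):
    """Keyed permutation of [0, 2**n_bits) via a balanced Feistel
    network (n_bits must be even here, for simplicity)."""
    half = n_bits // 2
    mask = (1 << half) - 1
    L, R = x >> half, x & mask
    for r in range(rounds):
        # Round function: a hash of (key, round, R), truncated to half bits.
        # Feistel is invertible no matter what this function is.
        f = int.from_bytes(
            hashlib.sha256(f"{key}:{r}:{R}".encode()).digest()[:8], "big"
        ) & mask
        L, R = R, L ^ f
    return (L << half) | R

def permute(x, key, n, n_bits):
    """Permutation of [0, n) by cycle-walking: apply the 2**n_bits
    Feistel permutation until the value lands back inside [0, n)."""
    while True:
        x = feistel_perm(x, key, n_bits)
        if x < n:
            return x
```

Cycle-walking preserves bijectivity (restricting a permutation's cycles to a subset is still a permutation of that subset) and the expected number of walks is 2^n_bits / n, so choosing the smallest even n_bits with 2^n_bits >= n keeps it cheap. This gives the "elastic size" property, but it says nothing about how large the generated subgroup is; that depends on the round function family.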

Thanks for your input.

07:18 UTC


NYC LeetCode Study Group

22:04 UTC


When I zoom in or out of an image set as the MS Edge background for a new tab, I notice that the inconsistencies in the noise grain become more pronounced. I'm curious about how the OS renders the image in this way and would like to understand this phenomenon in-depth.

14:03 UTC


why do PCs typically use 2^n for anything data-related

like, we can still represent any number in binary (1, 10, 11, 100, 101, 111, 1000, etc.), so how come PCs only ever use sizes of 2^n?
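The binary patterns themselves can indeed represent any number; the powers of two come from counting patterns, not from the numbers: n bits (or n address wires) distinguish exactly 2^n values, so any size that fully uses its bits lands on a power of two. A quick illustration:

```python
# n binary digits distinguish exactly 2**n patterns, so a register,
# address space, or memory size that fully uses its bits is a power of two.
for n_bits in (8, 16, 32):
    print(n_bits, "bits ->", 2 ** n_bits, "distinct values")

# A byte's 8 bits give exactly 256 distinct patterns, values 0..255:
assert len({format(v, "08b") for v in range(256)}) == 256
```

A size like 1000 would waste patterns: 10 wires already exist to address it, and those 10 wires can reach 1024 values, so hardware rounds up to the power of two.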

12:45 UTC


Why does the Software Engineering degree exist?

This is probably a bit off topic but was not sure where else to post.

Here in Aus it seems like the only difference between Software Engineering and CompSci degrees is that you spend a year studying random engineering things. So why does this degree exist?

My best guess would be that historically computer development was an engineering area and that the idea of a "programmer" or "computer scientist" was not a thing until later. Is this right?

Edit: just as a little note, I was not throwing shade at Software Engineering or Software Engineers. My question stems from the two universities I have attended here in Aus: QUT and Deakin. At both the SE degree is just the first year of an Engineering degree and then a copy of the IT/CS degree.

I now know that SE does specialise in different stuff than IT and CS.

01:14 UTC


Questions about computational Neuroscience

Hello. I’m a grad student who got into MS CS at UMich and JHU for fall '24.

I find computational neuroscience very interesting, but it’s relatively new to me. UMich Ann Arbor is ranked highly for CS, whereas JHU is ranked higher for neuroscience. Considering this, which university would you prefer for computational neuroscience, and why?

Another question: how good is this concentration? I understand it’s highly research-oriented, but I am not inclined towards a PhD after graduation. I would like to get an AI/ML engineer (since it’s closely related), research scientist, or applied scientist role. I will be taking AI and ML courses, and in addition I thought I’d get into the computational neuroscience concentration. What job opportunities can I expect with this, and is it worth it? Will it open doors to AI opportunities at big tech too?

23:39 UTC


P4 code migration from IPv4 to IPv6

I was recently working on a project written in P4, an open-source programming language, with a working example of an IPv4 router. The code was working fine. However, I need some help modifying it to route IPv6 and to filter out TCP/UDP packets that do not have a specific port. I'll be adding screenshots of the working P4 code written for IPv4.





17:42 UTC


How Apple Uses ML To Recognize Different Individuals Across Photos. A 5-minute visual guide. 📱

TL;DR: Embedding models pre-trained using contrastive learning. Hierarchical clustering is used to carve the embedding space to recognize different individuals.

Here is a visual guide covering the technical details: How Apple Uses ML
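As a rough illustration of the clustering half (a toy sketch, not Apple's actual pipeline): agglomerative clustering starts with every face embedding in its own cluster and keeps merging the closest pair until the nearest two clusters are farther apart than a threshold; each surviving cluster is then treated as one individual.

```python
import math
from itertools import combinations

def cluster(points, threshold):
    """Toy single-linkage agglomerative clustering over embeddings."""
    def gap(a, b):  # single linkage: distance of the closest cross-pair
        return min(math.dist(p, q) for p in a for q in b)
    clusters = [[p] for p in points]
    while len(clusters) > 1:
        # Find the two closest clusters.
        i, j = min(combinations(range(len(clusters)), 2),
                   key=lambda ij: gap(clusters[ij[0]], clusters[ij[1]]))
        if gap(clusters[i], clusters[j]) > threshold:
            break                              # nothing close enough to merge
        clusters[i].extend(clusters.pop(j))    # j > i, so index i stays valid
    return clusters

# Two tight groups of 2-D "face embeddings" -> two recognized individuals.
faces = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (5.0, 5.0), (5.1, 5.0)]
print(len(cluster(faces, threshold=1.0)))  # 2
```

The appeal over k-means for this task is that the number of individuals is not known in advance; only a "same person" distance threshold is needed.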



13:42 UTC


Fast inpainting models

What is the fastest model architecture that supports inpainting/outpainting with reasonable quality?

Does anyone know if there is an inpainting/outpainting pipeline with SDXL Turbo?

04:21 UTC


Is it possible to boost an LLM's short-term memory by freely fetching from an enlarged (entire-RAM), modified KV cache?

Is it possible to recreate a logical short-term memory (like the human one that takes part in major intelligence workloads) by recording every single neuron's KV from the AI like a log, then retrieving it by fetching?

An AI's memory capacity shouldn't rest on how neurons naturally react to recreate a reflex that occurred before. It should be managed by a local memory unit (hence one that can be read and written), because a memory unit has unique, near-perfect attributes for memorizing things (i.e. it will not forget, and it is fast to retrieve/fetch from, with high read speed). Using memory as logical memory for AI could significantly improve reasoning intensity, speed, etc.

19:59 UTC


Tau’s CTO Explores Decentralized AI at Today’s AI Roundtable - Join Us at 4PM UTC

Greetings /r/compsci,

For those with a keen interest in the convergence of computer science and advanced artificial intelligence technologies, today’s AI Roundtable Twitter Space event is not to be missed. Tau's CTO, Ohad Asor, will join a select group of AI experts to discuss the future and implications of decentralized AI systems.

Why This is Important:

  • Technical Depth: Dive into the complex technicalities of AI and how decentralization could redefine its frameworks and applications.
  • Future Directions: Understand the trajectory of AI research and the potential for decentralized systems to influence future innovations in computer science.

Event Details:

  • Date: Today, April 11th
  • Time: 4 PM UTC
  • Location: The Roundtable Show on Twitter Spaces
  • Access Link: Click here to be part of the conversation.

This session promises rich discussions on the theoretical and practical aspects of decentralized AI, offering valuable insights for students, researchers, and professionals in computer science.

Share this with anyone passionate about the future of compsci and AI. Your engagement can help shape the conversation around these pivotal technologies.

Hope to see you there!

Cheers, The Tau Team

13:41 UTC


How to reason about synchronization

In my experience, people don't learn about synchronization at school, and it's often seen as an advanced topic of sorts, mostly for people doing multithreaded work. Even engineers working with MT don't really think that much about it; with luck they will care about locking/unlocking the mutexes in the right order. But it's not just MT: synchronization problems are everywhere. I just encountered incorrect behaviour because dbus messages were exchanged in a different order than expected, and a friend just told me that this is a common problem in microservices architectures.

How do you reason about synchronization? I found out about order theory; is that a good framework to model such problems, or is something else needed?
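Order theory is a reasonable fit: "happened-before" in a concurrent or distributed system is a partial order, and vector clocks are the classic device that makes it computable. Message reorderings like the dbus case show up as incomparable (concurrent) events, i.e. pairs the partial order does not constrain. A minimal sketch:

```python
def vc_leq(a, b):
    """a happened-before (or equals) b iff a <= b componentwise."""
    return all(x <= y for x, y in zip(a, b))

def concurrent(a, b):
    """Concurrent = incomparable in the partial order: ordered neither way."""
    return not vc_leq(a, b) and not vc_leq(b, a)

# Two processes. Vector-clock rules: tick your own slot on each event;
# on receive, take the componentwise max with the message's clock, then tick.
e1 = [1, 0]   # p0: local event
e2 = [2, 0]   # p0: sends a message
e3 = [0, 1]   # p1: local event, before the message arrives
e4 = [2, 2]   # p1: receives e2 -> max([2, 0], [0, 1]) plus its own tick

print(concurrent(e2, e3))  # True: no ordering between them is guaranteed
print(vc_leq(e2, e4))      # True: the send happened before the receive
```

Any execution order consistent with the partial order is possible, which is exactly why code that silently assumes one particular total order (of dbus messages, of microservice calls) breaks intermittently: it works only on the schedules that happen to match the assumption.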

11:19 UTC


Stuck in learning DSA, seeking some advice...

So, I started DSA in C++. I stumbled upon a course on DSA (from GFG; a friend gave it to me, yes pirated, I mean it's too costly), and it's long. Now, when it came to web development, I was quite the note-taking pro. But it's a whole different story when tackling DSA.

Videos just 7-10 minutes long are taking me a solid hour or two to digest fully! I've tried speeding up the lectures to 1.5x, but here's the kicker: I end up pausing every few seconds just to jot down notes. It's like a never-ending cycle of rewind, pause, write, repeat. And I get it, notes are essential, but man, it's slowing me down big time.

08:57 UTC


Algorithm Complexity

I am an undergraduate student in statistics and mathematics, and recently I saw a problem where you have to prove that, for a given algorithm A, no better algorithm B exists, i.e. that no algorithm B has lower time complexity than A. But how does one go about proving something like this? I don't have any idea. In maths, whenever you have to prove such a thing does not exist, you basically use proof by contradiction. But how do you approach this for algorithms? If you can refer me to any paper where such a thing has been proven, that would be useful, or to any method or approach computer science generally uses to tackle this type of problem.
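The usual approach is to prove the bound against a model of computation: you show that *any* algorithm in the model must do a certain amount of work to distinguish all possible inputs, so no algorithm, including ones nobody has invented, can beat it. The classic worked example is the decision-tree lower bound for comparison sorting, found in any algorithms text (e.g. CLRS):

```latex
% Any comparison sort on $n$ elements is a binary decision tree whose
% leaves are the possible output orderings; to be correct it must
% distinguish all $n!$ input permutations, so it needs at least $n!$
% leaves. A binary tree of height $h$ has at most $2^h$ leaves, hence
\[
  2^{h} \;\ge\; n!
  \quad\Longrightarrow\quad
  h \;\ge\; \log_2(n!) \;=\; \Theta(n \log n)
\]
% by Stirling's approximation, so every comparison sort makes
% $\Omega(n \log n)$ comparisons in the worst case.
```

Note the two-sided structure: an upper bound comes from exhibiting one algorithm (e.g. mergesort at O(n log n)); the lower bound is a counting or adversary argument over all algorithms in the model. When the two meet, the algorithm is optimal in that model. Unconditional lower bounds in unrestricted models are rare and mostly open, which is why such problems always fix a model first.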

08:34 UTC


Simple and effective algorithm for constant state detection in time series

07:50 UTC


automathon: A Python library for simulating and visualizing finite automata
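For anyone curious what such a library automates: a DFA is just a transition table walked one symbol at a time. A generic sketch of the simulation (this is an illustration, not automathon's actual API):

```python
def run_dfa(delta, start, accepting, word):
    """Simulate a DFA given as delta = {(state, symbol): next_state}.
    Returns True iff the DFA ends in an accepting state."""
    state = start
    for symbol in word:
        state = delta[(state, symbol)]
    return state in accepting

# DFA over {0, 1} accepting strings with an even number of 1s.
delta = {("even", "0"): "even", ("even", "1"): "odd",
         ("odd", "0"): "odd",   ("odd", "1"): "even"}
print(run_dfa(delta, "even", {"even"}, "1011"))  # False: three 1s
```

A dedicated library adds the parts worth not rewriting by hand: NFA-to-DFA conversion, minimization, and rendering the transition graph.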

06:45 UTC


Bachelor's Thesis about data in Motorsport

I am a last-year student of an engineering degree, and I must do a bachelor's thesis. While I am not betting my whole future on getting a job in motorsport, it has been a dream of mine for quite a while, mainly in formula or endurance racing. So I was thinking about doing something related to this field. My idea was to do something about data analysis and use that for a prediction system or something along those lines. Any ideas? Any help is appreciated!

20:26 UTC


Executable format used in TempleOS and design considerations for executable format

Hi guys,

I recently watched a video clip of Terry Davis where he mentioned that TempleOS doesn't use common executable formats like ELF or PE. Which format does it use and what information is present in that format (symbol tables, etc.)?

Also, are there any guidelines for executable format design?

09:46 UTC
