/r/compsci


Computer Science Theory and Application. We share and discuss any content that computer scientists find interesting. People from all walks of life welcome, including hackers, hobbyists, professionals, and academics.

Welcome Computer Science researchers, students, professionals, and enthusiasts!

Guidelines

Self-posts and Q&A threads are welcome, but we prefer high-quality posts focused directly on graduate-level CS material. We discourage most posts about introductory material, how to study CS, or careers. For those topics, please consider one of the subreddits in the sidebar instead.

Want to study CS or learn programming?

Read the free original Structure and Interpretation of Computer Programs (or see the online conversion of SICP).

If you are new to Computer Science please read our FAQ before posting. A list of book recommendations from our community for various topics can be found here.

/r/compsci

3,615,749 Subscribers

0

Blogs and Websites

It's been a while, so I'd like you to name at least one related blog you visit regularly. Optionally, say why it's interesting, with some keywords.

Starting with:

https://news.ycombinator.com/

https://pete.akeo.ie/ - Security Researcher

https://avikdas.com/ - Computer scientist; 3D and dynamic programming, as well as scalability and other topics

https://karpathy.ai/ - Machine Learning

2 Comments
2024/10/17
10:47 UTC

0

Is there such a thing as 100% anonymity on the internet? Are onion routing and similar techniques fully anonymous?

13 Comments
2024/10/16
21:47 UTC

0

Syntax can be specified with a meta-syntax called BNF. But what is the meta-meta-syntax defining BNF? And the meta-meta-meta-syntax describing that meta-meta-syntax, and so on?

Hi guys, sorry if this seems like a stupid question. I was going through this part in Crafting Interpreters, and I came across this side note:

Yes, we need to define a syntax to use for the rules that define our syntax. Should we specify that metasyntax too? What notation do we use for it? It’s languages all the way down!

But this would lead to an infinite recursion of sorts, defining each meta^n language using a meta^(n+1) language. I read on Wikipedia that BNF can be used to describe its own syntax; is that why we don't have this infinite recursion in practice?
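For what it's worth, the Wikipedia observation is the standard answer: the tower bottoms out because BNF is expressive enough to describe itself. A rough, simplified sketch of BNF written in BNF (quoting and whitespace rules glossed over, so treat it as an illustration rather than a complete grammar):

    <grammar>    ::= <rule> | <rule> <grammar>
    <rule>       ::= <name> "::=" <expression>
    <expression> ::= <sequence> | <sequence> "|" <expression>
    <sequence>   ::= <term> | <term> <sequence>
    <term>       ::= <name> | <literal>

Once the meta-syntax can express its own rules, no meta-meta level is needed; the remaining bootstrapping (what <name> and <literal> mean exactly) is typically handled in a line or two of prose.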

6 Comments
2024/10/16
13:26 UTC

0

[R] Your neural network doesn't know what it doesn't know

0 Comments
2024/10/16
03:17 UTC

9

What is the difference between Conference Papers, Reviews, Literature, and Literature Review Papers in Computer Science?

Where can I publish any of those papers?

14 Comments
2024/10/15
15:34 UTC

3

Looking for semi-advanced resources about codecs

Hi guys,

I'm looking for resources explaining the inner workings of the following video codecs: H.264, H.265, VP9, AV1, VVC.

I need something more detailed than the articles you can find by googling "H264 technical explanation"; I already understand the concepts of I/P-frames, DCT, transform blocks, etc. (It doesn't help that many of the articles seem copy-pasted or AI-generated, or just cover how much bandwidth codecs save.)

However, the official documentation for these codecs is really overwhelming (the ITU-T H.264 spec runs 844 pages); I'm looking for something in between in terms of technical depth.

Thanks for any replies; a resource covering just one of the codecs listed above is fine too.

1 Comment
2024/10/15
09:54 UTC

0

Advice on Algorithm Choice for Combinatorial Optimization Problem

Hello everyone, I have a question regarding the choice of algorithm for a combinatorial optimization problem I am facing. Sorry for the long post; I want to describe my problem as clearly as possible.

I am trying to solve a combinatorial optimization problem. I don't have the exact number of parameters yet, but the estimate is around 15 to 20, and each parameter can have anywhere between 2 and 4 valid options (a major chunk, roughly 50%, might have only 2). The main problem I am facing is that the cost evaluation for each solution is very expensive, so I can only perform a total of 100-125 evaluations (since I have access to a cluster, I can parallelize 20-25 of the calculations). Given the nature of my problem, I am okay with not landing on the global optimum: a result that is a good improvement over my current solution is a win (if I miraculously find the global optimum, that solution is of course favored, even at somewhat higher compute time). I don't want to color the reader with my opinion, but the current idea is to use a genetic algorithm with a population size of 20-25 and 4-5 generations, tournament selection, and a mutation rate on the higher end to ensure exploration of the search space; a rough sketch of such a loop is at the end of this post. (The exact figures/parameters for the genetic algorithm are not decided yet. I am quite inexperienced in this field, so is there a principled way to come up with these numbers?)

If any folks here have experience with combinatorial optimization problems, I would love to hear your thoughts on using a genetic algorithm for this, or pointers to alternative algorithms suited to the problem described above. I am using a smaller toy version of my original problem, so I do have some freedom to experiment with different algorithm choices and their parameters.

P.S. From my understanding, simulated annealing is inherently a sequential algorithm, so I have eliminated it. Also, this is my first time dealing with a problem of this scale, so any advice is appreciated.

P.P.S. I can't divulge more details of the problem, as they are confidential. Thanks for understanding.
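For concreteness, a minimal sketch of the GA loop described above (hypothetical encoding and a placeholder cost function; the population size, generation count, and mutation rate are the figures from the post, not tuned recommendations):

    import random

    # Hypothetical encoding: ~18 parameters, each with 2-4 valid options.
    OPTIONS = [2, 3, 2, 4, 2, 3, 2, 2, 4, 3, 2, 2, 3, 2, 4, 2, 3, 2]

    def evaluate(candidate):
        # Placeholder for the expensive cost function (lower is better).
        # In practice, farm these out to the cluster, 20-25 in parallel.
        return sum(candidate)

    def random_candidate():
        return [random.randrange(n) for n in OPTIONS]

    def tournament(pop, scores, k=3):
        picks = random.sample(range(len(pop)), k)
        return pop[min(picks, key=lambda i: scores[i])]

    def crossover(a, b):
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:]

    def mutate(c, rate=0.2):  # mutation rate "on the higher end"
        return [random.randrange(OPTIONS[i]) if random.random() < rate else g
                for i, g in enumerate(c)]

    best, best_score = None, float("inf")
    pop = [random_candidate() for _ in range(25)]   # population of 20-25
    for gen in range(5):                            # 4-5 generations => <=125 evaluations
        scores = [evaluate(c) for c in pop]         # the parallelizable step
        for c, s in zip(pop, scores):
            if s < best_score:
                best, best_score = c, s
        pop = [mutate(crossover(tournament(pop, scores),
                                tournament(pop, scores)))
               for _ in range(len(pop))]

One budgeting note: 25 candidates times 5 generations is exactly the 125-evaluation budget, so tracking the best-so-far outside the population (as above) is cheap insurance against the high mutation rate destroying good solutions.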

2 Comments
2024/10/15
01:27 UTC

37

I think I found my "passion" but I can't imagine working in academia.

I've recently found that I really enjoy theoretical computer science, even though my degree is more like an applied mathematics degree. I love working on advanced algorithms, really enjoy things like complexity theory, and I'm planning to take other theoretical classes soon, like graph theory, advanced algorithms, and maybe even cryptography. I want to focus the rest of my degree on theoretical computer science and either get a CS master's with a focus on theory or a mathematics master's with a focus on discrete maths/computer science. I'm only in my second year, so I really haven't paid attention to the job market, and I have no idea what kind of jobs are out there.

Most of the computer science jobs I hear about are one of:

  1. Software engineer/developer: sounds like a nightmare to me. I actually don't like coding that much; I enjoy the algorithmic problem-solving part, and coding is just a tool for me to work on problems I enjoy. I know people who work as software engineers, and it just sounds like a boring desk job.

  2. Data scientist: I don't mind probability theory, but I don't like statistics (idk if that makes sense lol), and from what I've seen, machine learning doesn't really excite me either.

  3. Jobs in IT, web development, etc., which all sound kinda tedious to me.

Now a lot of people will probably suggest a PhD and going into academia. Even though I'd consider getting a PhD, I just can't see myself working in academia. It's more of a personality thing, really; I don't see myself fitting into that type of environment. My ideal job is a research position out in industry that is heavily theoretical, somewhere in between mathematics and computer science. I just don't know if that exists. Do you have any advice? Do any of you work on theoretical computer science outside of academia? I would appreciate any advice, and sorry for the long rant; I'm just kind of lost at the moment.

34 Comments
2024/10/14
20:38 UTC

25

What's your favourite algorithm(s)? Mine is public-key algorithms; they seem magical.
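A toy illustration of why they feel magical, using nothing beyond Python's built-in pow (Python 3.8+ for the modular inverse; textbook-sized primes, utterly insecure, demonstration only):

    # Toy RSA: encrypt with the public key (e, n), decrypt with the private d.
    p, q = 61, 53
    n = p * q                    # modulus: 3233
    phi = (p - 1) * (q - 1)      # 3120
    e = 17                       # public exponent, coprime with phi
    d = pow(e, -1, phi)          # private exponent: modular inverse of e, here 2753

    message = 42
    cipher = pow(message, e, n)           # anyone can encrypt...
    assert pow(cipher, d, n) == message   # ...only the holder of d can decrypt

The magic is that publishing e and n reveals nothing practical about d: recovering it requires factoring n, which is trivial for 3233 but infeasible at real key sizes.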

43 Comments
2024/10/14
14:02 UTC

13

New video on compiler system design

Hey everyone, I posted here a few weeks ago about the start of my YouTube channel on LLVM and compilers. I just uploaded a new video on compiler system design; I hope you all enjoy it! https://youtu.be/hCaBjH5cV5Q?si=njm0iA0h_vBz0MFO

0 Comments
2024/10/13
21:32 UTC

0

Exploring Concept Activation Vectors: Steering LLMs’ Behavior in Multiple Domains

I’m excited to share our recent work on steering large language models using Concept Activation Vectors (CAVs). This technique allows us to adjust the behavior of LLMs to act like domain experts (like Python or French) and even manipulate their refusal and language-switching capabilities. If you’re into AI interpretability or LLM safety, you might find our experiments and findings intriguing.

📄 Highlights:

  • Real-world examples, including generating Python code and switching between English and French.
  • Discussions on LLM behavior steering, safety, and multilingual models.
  • Insights into the future potential of CAVs in replacing system prompts and improving model alignment.

We’ve already expanded on the safety concept activation vector (SCAV) idea introduced earlier this year and observed some cool (and strange) phenomena, especially around language and task steering.

💡 Interested in how this works? Check out our full write-up on LessWrong. Would love your thoughts and feedback!

0 Comments
2024/10/12
14:18 UTC

0

T-Test Explained

Hi there,

I've created a video here where I talk about the t-test, a statistical method used to determine whether there is a significant difference between the means of two groups.
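For anyone who wants to try it alongside the video, a minimal sketch with made-up numbers, using SciPy's independent two-sample t-test (equal variances assumed by default):

    from scipy import stats

    # Hypothetical measurements for two groups.
    group_a = [5.1, 4.9, 6.2, 5.8, 5.5, 5.0]
    group_b = [6.5, 6.1, 7.0, 6.8, 6.4, 6.9]

    t_stat, p_value = stats.ttest_ind(group_a, group_b)
    print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
    # A small p-value (conventionally < 0.05) suggests the group means differ.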

I hope it may be of use to some of you out there. Feedback is more than welcomed! :)

2 Comments
2024/10/12
13:23 UTC

54

What's next for Computer Science?

I'm currently in university studying computer science, and I've found myself thinking a lot about where the field of CS is going to go. The last few decades have seen basically exponential growth in computers and technology, and we're still seeing rapid development of new applications.

I have this irrational worry that I keep coming back to: when, if ever, will we see CS start to plateau? I know this is incredibly short-sighted of me, probably because I just don't know enough about the field yet to imagine what comes next.

Which is why I'm asking here, I guess. Especially with thousands of voices constantly weighing in on AI/LLMs and whether they will be the unraveling of software engineering (personally, I don't think it's all doom and gloom, but there are certainly times when the loudest voices get to you), I'm trying to find areas of computer science that will continue to see effort poured into them, or nascent fields with the potential to grow over the course of my career. I'd appreciate some answers beyond AI/ML, because I know that's the hottest new thing right now.

I know I've rambled a bit in the post, so thank you in advance if you've read this far and even more so if you answer!

45 Comments
2024/10/11
20:23 UTC

0

Need an open source graph database for KG

Hi everyone, I am working on building a knowledge graph, and I want to store the data in a database with an Apache 2, BSD 3-Clause, or MIT license. I also want to store some extra metadata with the nodes and edges. Currently I have JanusGraph, Dgraph, and Memgraph in mind. Please suggest which one I should choose. Keep in mind that I would like to take this to production as well. Thanks a lot.

0 Comments
2024/10/11
12:02 UTC

0

Any resource that has hard theoretical problems for data structures and algorithms?

Aside from CLRS.

8 Comments
2024/10/11
02:22 UTC

2

Is creating a low-latency kernel bypass framework doable and worth it as my master's graduation project?

Hello everyone, I'm a master's student, soon to become a computer engineer. After a long journey searching for the right project idea for my degree, I knew I wanted to focus on something related to operating systems, low-level programming, or networking. However, I was unsure about the exact direction, especially since I now lean more toward software-oriented work. Recently, I came across an interesting theme: "Low-Latency Kernel Bypass Framework for High-Performance Networking." I'm considering pursuing this idea, but I have a few concerns. Is it feasible to complete within a one-year period? And would this project be a case of reinventing the wheel, given that some existing tools already perform similar tasks? If you have better project ideas, please feel free to share them here. Thank you!

8 Comments
2024/10/10
16:23 UTC

0

The Role of Expertise in Human-AI Collaboration

0 Comments
2024/10/10
16:02 UTC

0

Need Insights: What Topics Do LLMs Struggle With the Most?

Hello guys, I'm working on a RAG architecture for my CS grad project, so I want to know which topics or fields ChatGPT and other popular LLMs are worst at, where they can't give accurate answers.

5 Comments
2024/10/09
15:47 UTC

0

Are there coding/LT books that use Neovim?

0 Comments
2024/10/09
14:22 UTC

3

Learning Operational Semantics

Hi,

I'm learning it from the following public e-book (Principles of Programming Languages, by Smith, Palmer and Grant):
http://pl.cs.jhu.edu/pl/book

But I'd like to read and learn more from different sources. Any recommendations?

Thanks!

2 Comments
2024/10/09
12:06 UTC

0

What are the problems associated with Logistic Regression?

In which areas can we do research and try to solve the problems of logistic regression?

5 Comments
2024/10/09
08:52 UTC

0

Are programming books overrated?

To start off, none of my friends who program have ever read a book; they used courses such as DataCamp or Codecamp instead. But then I thought: how could a book even come close to something like DataCamp? I mean, DataCamp is so much more hands-on than books, gives really good examples, and has quizzes.

30 Comments
2024/10/09
03:09 UTC

88

I designed a simple 8-bit CPU called Flip01

Hi!

It’s a small 8-bit CPU with a 16-bit address bus, and you can find it on GitHub (here's a quick overview).
I’d love to get your feedback, whether it’s advice on how to improve it or even some critiques!

Thanks a lot!

53 Comments
2024/10/08
23:34 UTC

4

Virtualization vs Cgroups & Namespaces

I often see explanations of containers mention that IT teams traditionally hosted virtual machines, but that Docker containers simplified things using cgroups & namespaces. I understand how Docker simplifies the process, but why did people create VMs instead of manually creating namespaces & cgroups for each application?
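To make "manually" concrete, here is a rough sketch of the raw plumbing a container runtime automates (Linux only, Python 3.12+ for os.unshare, needs root, and assumes cgroup v2 mounted at /sys/fs/cgroup with the memory controller enabled):

    import os

    # Give this process its own UTS (hostname) namespace.
    os.unshare(os.CLONE_NEWUTS)

    # Create a cgroup, cap its memory, then move ourselves into it.
    cg = "/sys/fs/cgroup/demo"
    os.makedirs(cg, exist_ok=True)
    with open(f"{cg}/memory.max", "w") as f:
        f.write("268435456")          # 256 MiB limit
    with open(f"{cg}/cgroup.procs", "w") as f:
        f.write(str(os.getpid()))     # enroll this process in the cgroup

Multiply that by mount, PID, network, and user namespaces, plus image distribution and cleanup, and the historical answer emerges: before Docker packaged this plumbing, a VM was the far easier way to get an isolated, resource-limited environment.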

3 Comments
2024/10/08
21:50 UTC

0

Would you rather prove P != NP or make an indie game like Baba is You? And why?

17 Comments
2024/10/08
19:10 UTC

6

Are there any video games that take you through software design/architecture?

I'm looking to learn more about systems design and software design: things like event-driven architecture and AWS services such as SQS, SNS, Lambda, Step Functions, etc. There are plenty of books, but I don't know which are actually good, and they're all a bit dry. I'm wondering if there are any alternatives, like games, that would be more interesting while still being informative/useful.

40 Comments
2024/10/08
18:46 UTC

0

Is the reality aware of abstractions?

I'm writing this computer science course on abstractions where we start with the question: Are you a bunch of cells, atoms, or a human - or all of the above?

The idea is to show that we use abstractions to manage complex systems. This is possible in math (where we have a line as an abstraction of multiple points and a plane as an abstraction of multiple lines), and the same is true in computer science.

I was curious whether reality is aware of these abstractions, or whether it operates only at a very fundamental level. There is this theory that everything is based on computation, even in the real world. So I was just curious: does reality operate on some abstractions, or is that just how we observe it?

24 Comments
2024/10/08
04:49 UTC

5

Some questions about instruction size relating to CPU word size

I started watching Ben Eater's breadboard computer series, where he builds an 8-bit computer from scratch. When it came to instructions, because the word size is 8 bits, the instruction was divided into 4 bits for the opcode and 4 bits for the operand (an address, for example). For example, LDA 14 loads the value of memory address 14 into register A. I did some research and saw that there are architectures with fixed-size instructions (like early MIPS, I believe), and in those architectures it would not always be possible to address the whole memory in one instruction, because you need to leave some space for the opcode and some other reserved bits. In that situation, you would need to load the address into a register in two steps and then reference that register. Did I understand this part correctly?
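For the fixed-size case, the constraint is just bit budgeting. A sketch using field widths in the style of a MIPS-like J-format (illustrative assumption, not any specific real ISA):

    # A fixed 32-bit instruction word: the opcode consumes bits, so the
    # embedded address field cannot cover the full address space.
    OPCODE_BITS, ADDR_BITS = 6, 26              # MIPS-like J-format split

    def encode_jump(opcode: int, address: int) -> int:
        assert 0 <= opcode < (1 << OPCODE_BITS)
        assert 0 <= address < (1 << ADDR_BITS)  # only 2**26 targets reachable
        return (opcode << ADDR_BITS) | address

    # Loading a full 32-bit constant therefore takes two instructions,
    # e.g. MIPS's lui (set the upper 16 bits) then ori (OR in the lower 16).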

In more modern architectures, instructions may not be fixed-size; in x64 an instruction can be up to 15 bytes long. I'm trying to wrap my head around how this works given the fetch-decode-execute cycle. Coming back to the 8-bit computer, we are able to fetch a whole instruction in one clock cycle because the whole instruction fits in 8 bits, but how does this work in x64, where the word size is 64 bits but an instruction can be much bigger than that?

These questions seem easy to just Google but I wasn't able to find a satisfying answer.

19 Comments
2024/10/07
15:03 UTC

0

Can some of you share your custom CPU/GPU architecture designed in Verilog, VHDL, or Logic.ly?

PLSSSSS

0 Comments
2024/10/07
13:59 UTC
