/r/compsci


Computer Science Theory and Application. We share and discuss any content that computer scientists find interesting. People from all walks of life welcome, including hackers, hobbyists, professionals, and academics.

Welcome Computer Science researchers, students, professionals, and enthusiasts!

We share and discuss content that computer scientists find interesting.

Guidelines

Self-posts and Q&A threads are welcome, but we prefer high quality posts focused directly on graduate level CS material. We discourage most posts about introductory material, how to study CS, or about careers. For those topics, please consider one of the subreddits in the sidebar instead.

Want to study CS or learn programming?

Read the original free Structure and Interpretation of Computer Programs (or see the Online conversion of SICP)


If you are new to Computer Science please read our FAQ before posting. A list of book recommendations from our community for various topics can be found here.

/r/compsci

3,810,961 Subscribers

7

Excellent free course on Model Checking

I have recently been interested in developing my skills in model checking. Doing some research on YouTube, I found this lecture series and the associated website for the course. I have watched the first lecture now and it seems fantastic.

Video playlist: https://www.youtube.com/playlist?list=PLwabKnOFhE38C0o6z_bhlF_uOUlblDTjh

Course site: https://moves.rwth-aachen.de/teaching/ss-18/introduction-to-model-checking/

2 Comments
2024/12/05
23:48 UTC

0

Do I further my learning in Python with new libraries, projects, etc or start a new language?

So I’ve been learning quite a bit of Python already. I am now confident in my ability to handle the fundamentals of Python, like control structures, functions, modules and libraries, file and error handling, and lists/dictionaries, and I also have a solid understanding of some libraries like datetime and pandas. I did create a small project in which I programmed a to-do-list app; it’s not the best code, but it works. The problem is that now I’m not sure where to head from here. I’m about to finish my first semester of freshman year, and I want to say I’m ready to move on from Python. But at the same time, I know Python is capable of way more complex applications than a simple to-do list (e.g. the stuff developers code for companies).

3 Comments
2024/12/05
22:46 UTC

0

Next revolutionary idea

We’ve gone through many technological revolutions, from transistors to the Internet to AI. These ideas fundamentally change the game of how we think about the world, and how technology interacts with it. What do you think could be the next revolutionary idea and why?

9 Comments
2024/12/05
19:31 UTC

0

Seeking Guidance on Cryptography

Hi everyone,

I recently started my CS major at a small institute in India. For my first-semester project, I decided to create an encryption tool in C using Caesar cipher and a random number generator. While working on this project, I explored different encryption techniques and the mathematics behind them. This deep dive made me realize that cryptography is a field I'm very passionate about, especially the mathematical aspects.
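For readers unfamiliar with it, a Caesar cipher simply shifts each letter by a fixed amount. Here is a minimal C sketch of the kind of tool described; the hardcoded key is purely illustrative, and this is of course not secure encryption:

#include <stdio.h>
#include <ctype.h>

/* Shift each letter forward by 'key' positions, wrapping around the alphabet.
   Non-letter characters are left unchanged. */
static char caesar_shift(char c, int key)
{
    if (isupper((unsigned char)c))
        return (char)('A' + (c - 'A' + key) % 26);
    if (islower((unsigned char)c))
        return (char)('a' + (c - 'a' + key) % 26);
    return c;
}

int main(void)
{
    const char *message = "HELLO WORLD";
    int key = 3;                            /* illustrative fixed key */

    for (const char *p = message; *p; p++)
        putchar(caesar_shift(*p, key));
    putchar('\n');                          /* prints KHOOR ZRUOG */
    return 0;
}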

However, there's a problem: I have very few resources and no one to guide me. The professors at my college are not very open to helping students, and no one seems willing to answer questions. On top of that, every guide I come across online only seems to confuse me more.

That’s why I’m reaching out to this community for help. Could anyone provide a clear path or suggest specific topics/books to focus on, particularly in the mathematical side of cryptography? Even a small reference would be incredibly helpful.

Thank you in advance!

6 Comments
2024/12/05
16:55 UTC

4

Need some help/suggestions for getting into research

I'm a computer science student and I want to get into research, but I'm having some trouble starting out.

I'm passionate about theoretical stuff mostly, especially in machine learning or artificial intelligence.

Does anyone have suggestions for programs aimed at students, or anything like that? Or is it better to just start working on a paper, and if so, what's the best way to start? Thanks!

4 Comments
2024/12/04
11:33 UTC

5

What were the commonly seen or more influential data structures/algos textbooks by decade

I'm trying to work out which algorithms textbooks people were using, by decade. By the 90s, Sedgewick and Cormen were commonly seen. In the 80s, I've seen Rohl and Wirth's books (from the previous decade), and I've ordered a 1st edition '83 Sedgewick to compare to my 90s second edition.

What were other folks using in the 80s? How about by the 2000s?

10 Comments
2024/12/04
09:16 UTC

0

With the rapid growth of AI/ML and technology, how do you keep up with current trends, models, etc?

In my previous career, I would try to keep up with medicine by reviewing peer-reviewed studies, nursing organization articles, etc. I want to become more engaged with technology, and specifically with AI. Do you have any suggestions on news feeds, articles, seminars, etc.?

12 Comments
2024/12/03
21:17 UTC

10

First data structures/algorithms book covering hash tables + when they became common

I've been digging among some of my old CS books and have noticed a conspicuous absence of everyone's common data structure, the hash table. I was wondering if anyone could help me pinpoint which was the first CS text that covered hash tables, and help me get an idea of when they became ubiquitous and every textbook would cover them.

I know they were touched upon in, I think, the earliest edition of Knuth Vol. 3, and the original paper laying out some details (mostly hashing on its own) was in the 50s.

8 Comments
2024/12/03
15:06 UTC

7

Design pattern case studies.

Hi, I'm reading Design Patterns: Elements of Reusable Object-Oriented Software. Chapter 2 is a case study on designing a document editor, and it has been incredibly illuminating. I was wondering if there exists a similar source of design case studies for other software, such as a media player, an image editor, or something like MS Paint. Thank you.

2 Comments
2024/12/02
06:55 UTC

0

There have been many cycles of Intelligence growth and decrease. Will AI lead to another one?

Francis Bacon saw human history as one long, often repetitive cycle of waxing and waning intelligence. In his analysis of history, mankind’s knowledge didn't grow smoothly over time but rather moved through grand revolutions, golden ages where the mind flourished, followed by dark, stagnant periods that erased all progress. The Greeks, the Romans, and then the Renaissance each had their time in the sun, but each was also followed by an era where knowledge hit a plateau or even regressed. Think about the destruction of the Library of Alexandria and the purge of intellectuals. Will AI lead to another decline? https://onepercentrule.substack.com/p/ai-and-overcoming-the-threat-of-intelligence

19 Comments
2024/11/30
18:23 UTC

34

YouTube Channels similar to Core Dumped

Hi, I've been really loving all the Core Dumped videos, especially as someone getting into programming without a college degree.

That channel has been invaluable to me, and I want more videos like it.

Does anyone else have similar suggestions for computer science channels?

2 Comments
2024/11/30
17:11 UTC

0

Making a stopwatch - x16

So I'm working on a board and trying to make a reaction speed test.

The board I'm working with has an RTC (real-time clock). From that I can use seconds, minutes, and hours.

On the other hand, the board has a free-running 16-bit 1 MHz clock.

My current approach is to count clock cycles. That is done by comparing the current value of the free-running clock with its value when first called. If they are equal, then a cycle has completed, so CountCycle++. If the current value is less, then an overflow occurred and the clock wrapped back to 0, so CountCycle++ as well.

Then I convert CountCycle to ms by dividing the number of clock cycles by 45 (rough math; I was fried at this point).

I was debugging the code and the answers (in ms) were not realistic at all. Is the math wrong? Or is my way of counting cycles wrong? Personally I feel it is the latter, and that I am skipping clock cycles while checking if the button is pressed. If so, what suggestions do you have?
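For what it's worth, here is a minimal C sketch of one wraparound-safe way to accumulate elapsed time from a free-running 16-bit 1 MHz counter; timer_read() and button_pressed() are hypothetical stand-ins for whatever your board actually provides:

#include <stdint.h>

extern uint16_t timer_read(void);    /* hypothetical: read the free-running 16-bit counter */
extern int button_pressed(void);     /* hypothetical: poll the reaction button */

uint32_t measure_reaction_us(void)
{
    uint32_t elapsed = 0;            /* total elapsed ticks; 1 tick = 1 us at 1 MHz */
    uint16_t last = timer_read();

    while (!button_pressed()) {
        uint16_t now = timer_read();
        /* Unsigned 16-bit subtraction handles wraparound automatically,
           as long as we sample at least once every 65536 ticks (~65 ms). */
        elapsed += (uint16_t)(now - last);
        last = now;
    }
    return elapsed;                  /* at 1 MHz, divide by 1000 to get ms */
}

The key point is that you don't have to detect equality or wraparound explicitly: the difference (now - last), taken in 16-bit unsigned arithmetic, is exactly the number of ticks since the last sample.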

Feel free to ask any question I’ll do my best to answer.

5 Comments
2024/11/30
01:18 UTC

57

Why isn’t Windows implementing fork?

I was wondering what makes it so hard for Windows to implement fork. I read somewhere it’s because Windows is more thread-based than process-based.

But what makes it harder to implement copy-on-write and make the system able to support fork?
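For anyone unfamiliar with the call in question, here is a minimal sketch of fork()'s POSIX semantics; on typical implementations the child starts as a copy-on-write duplicate of the parent:

#include <stdio.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    int x = 42;                 /* initially shared copy-on-write between parent and child */
    pid_t pid = fork();         /* duplicate the whole process image */

    if (pid == 0) {
        x = 7;                  /* the write gives the child its own private copy of the page */
        printf("child: x = %d\n", x);
        return 0;
    }
    waitpid(pid, NULL, 0);
    printf("parent: x = %d\n", x);   /* still 42; the parent's copy is untouched */
    return 0;
}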

35 Comments
2024/11/30
00:53 UTC

3

What are your thoughts about Patterns of Distributed Systems book?

I've been searching for books on similar topics and found this one, but the reviews on Goodreads discouraged me. What do you think? There is another one, Distributed Systems by Maarten van Steen, which has better reviews.

10 Comments
2024/11/29
13:35 UTC

0

The Birth, Adolescence, and Now Awkward Teen Years of AI

These models, no matter how many parameters they boast, can stumble when faced with nuance. They can’t reason beyond the boundaries of statistical correlations. Can they genuinely understand? Can they infer from first principles? When tasked with generating a text, a picture, or an insight, are they merely performing a magic trick, or, as it appears, approximating the complex nuance of human-like creativity?

https://onepercentrule.substack.com/p/the-birth-adolescence-and-now-awkward

8 Comments
2024/11/28
17:08 UTC

8

Find the maximum number of mincuts in a graph

I have to prove that the maximum number of mincuts in a graph is nC2. Now I know Karger's algorithm has success probability of at least 1/nC2. Now P[success of Karger's algorithm] = P[output cut is a mincut] = (#mincuts)/(#all cuts). Then how are we getting that bound?
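For reference, the standard argument counts each mincut separately rather than using the ratio above; a sketch in LaTeX:

\[
\text{Let } C_1,\dots,C_k \text{ be the distinct mincuts. For each fixed } C_i,\qquad
\Pr[\text{Karger outputs } C_i] \;\ge\; \frac{1}{\binom{n}{2}}.
\]
\[
\text{These events are disjoint, so}\qquad
1 \;\ge\; \sum_{i=1}^{k} \Pr[\text{Karger outputs } C_i] \;\ge\; \frac{k}{\binom{n}{2}}
\;\;\Longrightarrow\;\; k \;\le\; \binom{n}{2}.
\]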

1 Comment
2024/11/27
17:56 UTC

86

I built a Programming Language Using Rust.

Hey Reddit!

I have been working on this project for a long time (almost a year now).

I am 16 years old, and I built this as a project for my college application (I'm looking to pursue CS).

It is called Tidal, and it is my own programming language written in Rust.

https://tidal.pranavv.site <= You can find everything on this page, including the Github Repo and Documentation, and Downloads.

It is a simple programming language, with a syntax that I like to call - "Javathon" 😅; it resembles a mix between JavaScript and Python.

Please do check it out, and let me know what you think!

(Also, this is not an ad; I want to hear your criticism of this project. One more thing: if you don't mind, please star the GitHub repo, it will help me with my college application! Thanks a lot! 💖)

34 Comments
2024/11/26
12:19 UTC

1

What are some different patterns/designs for making a program persistent?

Kinda noobish, I know, but most of the stuff I've done has been little utility scripts that execute once and close. Obviously, most programs (Chrome, explorer.exe, Word, GarageBand, LibreOffice, etc.) keep running until you tell them to close. What are some different approaches to make this happen? I've seen a couple of different patterns:

Example 1:

int main(){
  while(true){
    doStuff();
    sleep(amount);
  }
  return 0;
}

Example 2:

int main(){
  while(enterLoop()){
    doStuff();
  }
  return 0;
}

Are these essentially the only two options for making a program persistent, or are there other patterns too? As I understand it, these are both "event loops". However, by running in a loop like this, the program essentially relies on polling for events rather than directly reacting to them. Is there a way to be event-driven without having to rely on polling (i.e. have events pushed to the program)?
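For comparison, here is a minimal C sketch of the push-style alternative on POSIX, using select() to block until the OS reports an event (stdin is used as a stand-in event source); the process sleeps in the kernel instead of polling:

#include <stdio.h>
#include <unistd.h>
#include <sys/select.h>

int main(void)
{
    for (;;) {
        fd_set readfds;
        FD_ZERO(&readfds);
        FD_SET(STDIN_FILENO, &readfds);

        /* Block here until stdin has data: no polling, no sleep(). */
        if (select(STDIN_FILENO + 1, &readfds, NULL, NULL, NULL) < 0) {
            perror("select");
            return 1;
        }

        char buf[256];
        ssize_t n = read(STDIN_FILENO, buf, sizeof buf);
        if (n <= 0)
            break;                               /* EOF or error: leave the loop */
        fwrite(buf, 1, (size_t)n, stdout);       /* "handle" the event */
    }
    return 0;
}

The same shape generalizes to GUI toolkits and network servers (poll/epoll, kqueue, message queues): the loop still exists, but each iteration blocks until an event is delivered rather than repeatedly checking for one.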

I'm assuming a single-threaded program, as I'm trying to just build up my understanding of programming patterns/designs from the ground up (I know that in the past, they relied on emulating multithreaded behavior with a single thread).

3 Comments
2024/11/25
20:16 UTC

1

Thoughts on computer science using higher and higher level programming languages in order to handle more advanced systems?

(Intro) No clue why this started, but I’ve seen a lot of overhype around AI, and YouTubers have started making videos about how CS is now a dead-end choice for a career. (I don’t think so, since there is a lot happening behind the scenes of any program/AI/automation.)

It seems programming and computers overall have been going in this direction since they were built: toward handling more and more complex tasks with more and more ease at the surface level, making it more “human” and logical to operate things.

(Skip to here for main idea)

Think about how alien ships are often portrayed as very basic and empty inside when it comes to controls, even though the ship itself can defy physics and do crazy things. They're often operated through very direct, instinctual controls paired with some sort of automation system they can communicate with, or input information into, in a way that even a kid would understand. That's because once you reach such a high level of technology, there would be too much to keep track of (similar to how we've moved past writing in binary or machine code because there is too much to keep track of), so we seal those things off, make sure they're completely break-proof in software and hardware, and then let the pilots, who are often also the engineers, monitor what they need through a super simple human/alien design, able to change large or small aspects of the complex, multilayered system with only a few touches of a button.

This is kind of similar to how secure and complex iPhones were when they came out, and how we could do a lot that other phones couldn't, simply because Apple created a UI that anyone could use and gave people access to a bunch of otherwise complex things at the push of a button. Then engineers turned it into an art form through jailbreaking and modding these closed, complex systems, giving regular people customization that Apple didn't originally offer.

I think the same will happen with all of computer science: we will have super complex platforms and programs that can be designed and produced by anyone, not just companies like Apple, but whose internals are somewhat too complex for most people to understand. There will be engineers who can go in and edit, monitor, and even modify these things, and those people will be the new computer scientists, while the people who build programs using the already available advanced platforms will be more like companies sketching ideas on whiteboards, since anyone can do it.

What are your thoughts?

13 Comments
2024/11/25
19:56 UTC

0

Is studying quantum computing useless if you don’t have a quantum computer?

Hey All,

I recently started my Master of AI program, and something caught my attention while planning my first semester: there’s a core option course called Introduction to Quantum Computing. At first, it sounded pretty interesting, but then I started wondering if studying this course is even worth it without access to an actual quantum computer.

I’ll be honest—I don’t fully understand quantum computing. The idea of qubits being 1 and 0 at the same time feels like Schrödinger's cat to me (both dead and alive). It’s fascinating, but also feels super abstract and disconnected from practical applications unless you’re in a very niche field.

Since I’m aiming to specialize in AI, and quantum computing doesn’t seem directly relevant to what I want to do, I’m leaning toward skipping this course. But before I finalize my choice, I’m curious:

Is studying quantum computing actually worth it if you don’t have access to a quantum computer? Or is it just something to file under "cool theoretical knowledge"?

Would love to hear your thoughts, especially if you’ve taken a similar course or work in this area!

38 Comments
2024/11/25
16:52 UTC

8

Is the 4th edition of Computer Networks by Tanenbaum still relevant?

Hi, everyone!
I'm a newbie currently learning data structures and algorithms in C, but my next step would be Network Programming.

I found a used copy of Tanenbaum's Computer Networks (4th edition), and it's really cheap (€8). But it seems pretty old to me (2003), so I'm curious how relevant it is today and whether I'll miss much if I buy it instead of the 5th edition.

Thanks in advance!

4 Comments
2024/11/25
13:13 UTC

0

Demis Hassabis is claiming that traditional computers, or classical Turing machines, are capable of much more than we previously thought.

He believes that if used correctly, classical systems can be used to model complex systems, including quantum systems. This is because natural phenomena tend to have structures that can be learned by classical machine learning systems. He believes that this method can be used to search possibilities efficiently, potentially getting around some of the inefficiencies of traditional methods.

He acknowledges that this is a controversial take, but he has spoken to top quantum computer scientists about it, including Professor Zinger and David Deutsch. He believes that this is a promising area of research and that classical systems may be able to model a lot more complex systems than we previously thought. https://www.youtube.com/watch?v=nQKmVhLIGcs

15 Comments
2024/11/24
08:22 UTC

0

Join TYNET 2.0: Empowering Women in Tech through a 24-Hour International Hackathon!

🚀 Calling all women in tech! 🚀

TYNET 2.0 is here to empower female innovators across the globe. Organized by the RAIT ACM-W Student Chapter, this 24-hour international hackathon is a unique platform to tackle real-world challenges, showcase your coding skills, and drive positive change in tech.

🌟 Why Join TYNET 2.0?

Exclusively for Women: A supportive environment to empower female talent in computing.

Innovative Domains: Work on AI/ML, FinTech, Healthcare, Education, Environment, and Social Good.

Exciting Rounds: Compete online in Round 1, and the top 15 teams advance to the on-site hackathon at RAIT!

Team Size: 2 to 4 participants per team.

📅 Timeline

Round 1 (Online): PPT Submission (Nov 21 – Dec 10, 2024).

Round 2 (Offline): Hackathon Kickoff (Jan 10 – 11, 2025).

🎯 Who Can Participate?

Women aged 16+ from any branch or year are welcome!

📞 Contact for Queries

tynet.raitacmw@gmail.com

👉 Register here: http://rait-w.acm.org/tynet

#Hackathon #WomenInTech #TYNET2024 #Empowerment #Innovation

1 Comment
2024/11/23
10:51 UTC

0

Dynamic Lookahead Insertion for Euclidean Hamiltonian Path Problem

11 Comments
2024/11/22
13:14 UTC

0

Correct me if I'm wrong: Constant upper bound on sum of 'n' arbitrary-size integers implies that the sum has O(n) runtime complexity

We have a constant upper bound 'b' on the sum of 'n' positive arbitrary-size input integers, on a system with 'm'-bit words (usually m = 32 bits per word).

To represent 'b', we need to store it across 'w = ceil(log_2^m(b))' words.
(number of m-bit words to store all bits of b)
(formula is log base 2^m of b, rounded up to nearest whole number)

Then each positive arbitrary-size input integer can be represented with 'w' words (each input is at most 'b', since the inputs are positive and their sum is bounded by 'b'), and because 'w' is constant (it depends only on the constant 'b'), this summation has runtime complexity
O(n * w) = O(n)

Quick example:

m = 32
b = 11692013098647223345629478661730264157247460343808
⇒ w = ceil(log_2^32(11692013098647223345629478661730264157247460343808)) = 6

sum implementation pseudocode:

input = [input 'n' positive integers, each can be represented with 6 words]
sum = allocate 6 words
for each value in input:
    for i from 6 down to 1:            // least-significant word first (word 1 is the most significant)
        word_i = i'th word of value
        add word_i to i'th word of sum
        // carry any overflow bit into the (i-1)'th word of sum
return sum
end

sum runtime complexity: O(n * 6) = O(n)
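A minimal runnable C sketch of the same idea, assuming w = 6 words of 32 bits and storing words least-significant first (the opposite convention from the pseudocode above, which is just a notational choice):

#include <stdint.h>
#include <stdio.h>

#define W 6   /* fixed word count, determined by the constant bound b */

/* Add one W-word value into a W-word accumulator.
   Words are least-significant first; any carry out of the top word is
   discarded, which is fine because the total never exceeds b. */
static void add_into(uint32_t sum[W], const uint32_t value[W])
{
    uint64_t carry = 0;
    for (int i = 0; i < W; i++) {            /* constant number of word additions */
        uint64_t t = (uint64_t)sum[i] + value[i] + carry;
        sum[i] = (uint32_t)t;                /* low 32 bits stay in this word */
        carry = t >> 32;                     /* the carry propagates upward */
    }
}

int main(void)
{
    /* Two tiny example inputs; each still occupies W words. */
    uint32_t x[W] = {5, 0, 0, 0, 0, 0};
    uint32_t y[W] = {7, 0, 0, 0, 0, 0};
    uint32_t sum[W] = {0};

    add_into(sum, x);                        /* O(W) work per input value */
    add_into(sum, y);                        /* so O(n * W) = O(n) for n inputs */

    printf("lowest word of sum = %u\n", (unsigned)sum[0]);  /* prints 12 */
    return 0;
}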

prove me wrong

edit: positive integers, no negatives, thanks u/neilmoore

11 Comments
2024/11/21
19:14 UTC
