/r/compsci

Computer Science Theory and Application. We share and discuss any content that computer scientists find interesting. People from all walks of life welcome, including hackers, hobbyists, professionals, and academics.

Welcome Computer Science researchers, students, professionals, and enthusiasts!

Guidelines

Self-posts and Q&A threads are welcome, but we prefer high quality posts focused directly on graduate level CS material. We discourage most posts about introductory material, how to study CS, or about careers. For those topics, please consider one of the subreddits in the sidebar instead.

Want to study CS or learn programming?

Read the original free Structure and Interpretation of Computer Programs (or see the online conversion of SICP).

Related subreddits

Other topics are likely better suited for:

Other online communities:

If you are new to Computer Science please read our FAQ before posting. A list of book recommendations from our community for various topics can be found here.

/r/compsci

2,811,974 Subscribers

0

Is it okay to work with APIs on your resume projects?

I'm starting to really try to put a resume together for internships, and a prevalent thought is whether it's okay to use APIs in my projects, or to include the projects I've made with the help of APIs. I'm just asking because, looking at other people's resume templates for intern or entry-level roles, they never clarify whether they used APIs; I don't know if it's because they're just not doing it or because it's frowned upon to do so. Thank you.

7 Comments
2024/03/25
20:11 UTC

0

When to quit IT degree

So I did decently well in the first half of year one studying IT and got mostly A's in my courses, but I had major personal issues in the second half of the year and failed 3 out of 4 papers.

I convinced the course manager to let me return and carry on this semester. They arranged for me to complete this half of the semester with one fewer paper than usual to reduce my workload, but even still, I'm 6 weeks in and have neglected 2 of my papers while dedicating every night to my paper on OOP.

Despite all those hours (I'm talking 3+ hours every night), I've neglected my other papers to the point that I've done 0% of them.

I am still far behind on my OOP paper; I don't see myself being able to finish it, and it's due in just 4 days' time.

I haven't even completed a table explaining the core classes of my text-based video game, their relationships, their scope, etc. I still have to complete several UML diagrams, revisit everything I've done already, and then write a working code stub in C# to demonstrate the game, and I have no idea how to do it. I couldn't tell you much about C#; I can't even remember how to print Hello World.

When do I just call it quits? I don't understand it, I can't find motivation, I can't do self-directed learning to a schedule, I don't think I have the cognitive ability for this if I'm being honest, and I can't find any joy in this anymore.

6 Comments
2024/03/25
19:33 UTC

0

Any recent Nvidia interview experience? What to expect?

Has anyone recently interviewed at Nvidia? So far, whenever I have interviewed with Nvidia, I have only managed to reach the second phone interview, never onsite. Other than OS, LeetCode, and computer architecture, what should I prepare? I have never worked on GPUs, so I am really interested in working on those things.

6 Comments
2024/03/25
17:54 UTC

0

Understanding the Challenges and Pain Points of the Pull Request Workflow

The article below looks at why reviewing pull requests is seen as a time-consuming and repetitive task that is often prioritized lower than other work, and why conflicts often arise at the team level during PRs, leading to integration bottlenecks and dissatisfaction: Challenges and Pain Points - Pull Request Cycle

As a solution, it introduces CodiumAI's PR-Agent, a generative AI tool that aims to address the pain points for each persona, offering tailored PR feedback and summaries.

0 Comments
2024/03/25
10:57 UTC

2

Seeking Recommendations for Advanced Data Structures and Algorithms Learning Resources in Python

Hey everyone

I've recently been diving into Python programming and have gotten a good grasp on basic data structures like lists, tuples, and dictionaries. Now, I'm eager to level up my skills and delve into more advanced concepts such as linked lists, sorting algorithms, graphs, and more.
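
For a concrete taste of one of those "more advanced concepts", here is a minimal Python sketch of a singly linked list; the class and method names are just illustrative, not taken from any particular course or book:

    # A minimal singly linked list: each node stores a value and a pointer
    # to the next node; the list itself keeps a reference only to the head.
    class Node:
        def __init__(self, value, next=None):
            self.value = value
            self.next = next

    class LinkedList:
        def __init__(self):
            self.head = None

        def push_front(self, value):
            # O(1): the new node becomes the head and points at the old head.
            self.head = Node(value, self.head)

        def to_list(self):
            # Walk the chain of next pointers and collect the values.
            out, node = [], self.head
            while node:
                out.append(node.value)
                node = node.next
            return out

    lst = LinkedList()
    for x in (3, 2, 1):
        lst.push_front(x)
    print(lst.to_list())  # [1, 2, 3]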

I'm on the lookout for learning materials or courses that are exceptionally clear and beginner-friendly. I want to build a solid understanding without feeling overwhelmed. Additionally, I'm aiming to tackle LeetCode problems down the line, so resources that cover these topics in Python would be ideal.

If you have any recommendations for books, online courses, or tutorials that fit the bill, I'd greatly appreciate it! Thanks in advance for your help.

30 Comments
2024/03/24
06:54 UTC

1

Book Recommendations

If I was to only get one book related to computer science ever, which should it be?

12 Comments
2024/03/23
22:48 UTC

16

What is it that got you more invested in CS?

I’d say knowing about all the technological capabilities that are available today. That and research, that’s what gets me going in the morning.

29 Comments
2024/03/23
14:01 UTC

6

How well can shortest common supersequence over small alphabet size be approximated?

1 Comment
2024/03/23
03:42 UTC

6

Studying Parallel and Distributed Computing in C++

I have good experience with multithreaded C and C++ (mainly fine-grained locking, lock-free programming, and some thread pools), message queues, shared memory, and socket programming. I have some experience with SIMD.

But I have never used OpenMP, nor have I worked, per se, on parallel and distributed programming. What new things do I need to learn to be able to practice?

2 Comments
2024/03/22
15:33 UTC

0

Internet radio with web interface

Hi everyone. I'm looking to build an internet radio with a web interface for my work. I prefer a Linux-based OS on a thin client of whatever type. I'm no particular fan of the Raspberry Pi, and whatever software works for the Pi should also work on any other Debian-based OS.

The hardware part isn't the most difficult; what I can't figure out is what software I should or can use for this.

The reason for the web interface is that the device will be in an enclosed space.

Does anyone have a suggestion on where to go? Thanks for all replies.
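
As one hedged sketch of how small the web-interface part can be (Flask and mpv are my own choices here, not something from the post, and the stream URL is a placeholder): a tiny Python web app that exposes /play and /stop and shells out to a command-line player.

    # Deliberately tiny web front end for an internet radio box: Flask serves
    # two endpoints and shells out to mpv to play a hard-coded stream URL.
    import subprocess
    from flask import Flask

    app = Flask(__name__)
    player = None
    STREAM_URL = "http://example.com/stream"  # placeholder

    @app.route("/play")
    def play():
        global player
        if player is None or player.poll() is not None:
            player = subprocess.Popen(["mpv", "--no-video", STREAM_URL])
        return "playing\n"

    @app.route("/stop")
    def stop():
        global player
        if player is not None and player.poll() is None:
            player.terminate()
        return "stopped\n"

    if __name__ == "__main__":
        # Reachable from other machines on the LAN, since the box is enclosed.
        app.run(host="0.0.0.0", port=8080)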

4 Comments
2024/03/22
13:32 UTC

1

Training LLMs to follow instructions with human feedback (RLHF) - paper explained

Hi there,

I've created a video here where I talk about how we can train LLMs to follow instructions with human feedback, by looking at OpenAI's RLHF paper that they used to train ChatGPT.

I hope it may be of use to some of you out there. Feedback is more than welcome! :)

2 Comments
2024/03/22
11:45 UTC

11

Did CPU differences in the past matter for business as much as the NVDA/AMD gap in today's DL/LLM scene?

Hi everyone,

We know that today a lot of DL and LLM research and development rely on NVDA's CUDA, even if AMD is trying to catch up with its ROCm, but it seems that it's not there yet. It seems that there's a large gap between these two, so that people who want to do research or develop DL and/or LLM would usually buy NVDA's products instead of AMD's.

I thought of the following question: the Altair 8800 used the Intel 8080, and the IBM PC used the Intel 8088 (and PCs usually came with Intel CPUs until AMD caught up), whereas the Apple I and II used the MOS 6502, and the first Macintosh used the Motorola 68000 (before the PowerPC era).

So I was wondering, for people who have experienced or studied that era: was there any productivity gap between Intel machines and non-Intel machines as large as today's gap between NVDA and AMD? Some (small) businesses seemed to hold up fine with Apple machines even before Apple's cooperation with Intel; do you think this is or can be true today, e.g., for (small/medium) companies/studios or individuals that can only afford AMD GPUs, or is it more a bet on the future?

Thank you for your time.

10 Comments
2024/03/22
11:12 UTC

0

Why are LLMs bad at deductive reasoning?

I know that LLMs are programmed for pattern recognition rather than "true understanding".

But why aren't they given a component that can do sound logical reasoning, something like Kahneman's "slow thinking"?

Intuitively it seems to me that deductive inference should be easier to program compared to inductive reasoning, which involves complex learning algorithms requiring a lot of statistics and linear algebra.
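
As a rough illustration of that intuition, here is a minimal Python sketch of one classic form of sound deduction, forward chaining over if-then rules; the facts and rules are invented for the example and say nothing about how LLMs work internally:

    # Forward chaining: repeatedly apply "if all premises hold, add the
    # conclusion" until no new facts can be derived. This is sound deduction,
    # with no statistics or learning involved.
    def forward_chain(facts, rules):
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for premises, conclusion in rules:
                if conclusion not in facts and all(p in facts for p in premises):
                    facts.add(conclusion)
                    changed = True
        return facts

    # Hypothetical knowledge base, purely for the example.
    rules = [
        (("socrates_is_human",), "socrates_is_mortal"),
        (("socrates_is_mortal", "socrates_is_philosopher"), "socrates_ponders_death"),
    ]
    print(forward_chain({"socrates_is_human", "socrates_is_philosopher"}, rules))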

65 Comments
2024/03/22
05:27 UTC

22

Alternatives to Von Neumann Architecture

I read a bit about the Harvard and dataflow architectures, and I'm particularly curious why we didn't adopt Harvard, given that we get the extra security benefit of preventing buffer overflow attacks and we also get 2 separate buses for data and instructions. It seems we could also preserve the abstraction of sequential execution for the compiler with Harvard.

Was it just a historical / simplicity reason that we chose von Neumann and just stuck with it? If anything, the Harvard architecture seems even simpler?

17 Comments
2024/03/21
11:50 UTC

23

Behind in my comp sci degree

I'm currently in the second semester of my sophomore year at university, taking the third main CS class (advanced algorithm design). I managed to pass the first two classes (data structures and algorithms 1-2), but I feel incompetent. I understand simple ideas, but even things like pointers and function design confuse me. Once I see a completed version of code I can begin to understand it much more, but I wouldn't have been able to make it on my own. I've been doing labs with the help of AI to get through them and know that I'm extremely behind on skill and knowledge. This is what I feel I want to do in the future, but I'm just super nervous about what's ahead, especially internships. I'm holding off on taking the next core class and just wanted some input from you guys on what I should do and what would help me. Thanks.

31 Comments
2024/03/21
02:47 UTC

3

Undergrad research project

So I was given the opportunity to start undergrad research recently by a professor in the CS department, and I am making this my passion project.

Last semester, he let my class research a set of given topics, and I chose to focus on the advantages and disadvantages of AI simulation in the military. It was a super cool project and I did pretty well on it.

He wants me to do this so I can have research under my belt and on my resume. I want to focus in on AI and the military, and what issues there are, what’s the current consensus, and what needs to be solved.

With this, I really want to start it. I have some preliminary ideas, such as how we can get AI to mock human instinct and intuition and apply that to a real-world scenario.

Thoughts on this?

6 Comments
2024/03/20
19:14 UTC

0

Template Metaprogramming C++

How can I practice template metaprogramming in C++?

4 Comments
2024/03/20
18:37 UTC

7

Best resources for neuroscientist wanting to learn to code

Hi, I'm a PhD in neuroscience who's looking to get into coding as a distraction from my experimental work while still being useful. I've seen that Python or MATLAB are probably the best. Just wondering which would be more useful/beginner-friendly and where to start self-teaching coding.

Any replies will be greatly appreciated.

31 Comments
2024/03/20
14:50 UTC

4

AMA Session with Clinton Jeffery on Compilers and programming languages

0 Comments
2024/03/20
10:25 UTC

0

Mentoring a Junior Developer - Guide

The guide below explores how structured software engineering mentorship programs and experienced mentors offer guided practice and real-time feedback that propel trainees from theoretical knowledge to practical mastery, as well as how effective mentoring can accelerate their growth and boost your team's overall success: How to Mentor a Junior Developer: Ultimate Guide

3 Comments
2024/03/20
03:58 UTC

0

PhD / Doctorate Programs

I am planning to do a PhD / Doctorate program in either computer science or data science. I have a list of schools I have found so far where I can do this online. Anyone have any advice on good and/or bad experiences, schools, etc. before I commit to one?

19 Comments
2024/03/19
15:23 UTC

0

Multimethods

Who is into multimethods and generic functions (e.g., as in Common Lisp or Dylan) as distinct from single-dispatch OO (e.g., as in Smalltalk or Ruby)?

For whatever crazy reason, I took it into my head to implement my own system for multimethods (on top of a language that doesn't have them natively). My current need doesn't even require multiple dispatch. I just thought the syntax of calls using generic functions might look better than that of single dispatch and might bode better for meeting future requirements.

Terminology: a programmer adds "rules" to a generic function. Each rule has a head and a tail. The head is a pattern over the arguments to a call. The tail is a "method", i.e., a function or procedure that gets executed in case the head matches the arguments. Only one rule executes for a given call. Maybe some people would use the term "method" for the head and tail together, but Chat Gupta suggested, to the contrary, saying "method" for just the tail and "rule" for the whole.

At first, I thought I could make the system catch "ambiguous" combinations of rules and throw an error in that case. But I have given up on that and decided to just make sure the dispatch is deterministic (repeatable), based, as a last resort, on the order in which the programmer added the rules.

I use named parameters in the calls. Rules can specialize on parameter names (constraining their arguments to belong to classes or primitive types). I also want to allow, in calls, parameters not mentioned in the rules. In effect, rules that don't mention a given parameter leave it unconstrained.

I am not trying to emulate Common Lisp's :BEFORE, :AFTER, :AROUND.
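
As a rough Python sketch of the kind of system described above (my own toy version, not the poster's implementation): a generic function holds an ordered list of rules, each rule's head constrains named parameters by type, parameters the head doesn't mention stay unconstrained, and the first matching rule's tail runs.

    # Toy generic function with dispatch on named parameters. A rule's head
    # maps parameter names to required types; parameters the head doesn't
    # mention are left unconstrained. Rules are tried in the order they were
    # added, so dispatch is deterministic.
    class Generic:
        def __init__(self, name):
            self.name = name
            self.rules = []  # list of (head, tail) pairs

        def add_rule(self, head, tail):
            self.rules.append((head, tail))

        def __call__(self, **args):
            for head, tail in self.rules:
                if all(k in args and isinstance(args[k], t) for k, t in head.items()):
                    return tail(**args)  # only one rule executes per call
            raise TypeError(f"{self.name}: no applicable rule for {args}")

    collide = Generic("collide")
    collide.add_rule({"a": int, "b": int}, lambda **kw: "int/int")
    collide.add_rule({"a": str}, lambda **kw: "str/anything")

    print(collide(a=1, b=2))      # -> int/int
    print(collide(a="x", b=3.0))  # -> str/anything (b is unconstrained)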

2 Comments
2024/03/19
11:23 UTC

0

Two theorems of monotone reason, about cardinality and propositional deciders

  1. #P=#Q: The number of Boolean models of a logical form equals the number of valid quantifications. Knuth volume four.

The proof leads to a linear transformation from models to quantifications. My program works well for modest sizes. Monotone forms on n variables decide all 2^n quantifications.

  2. Monotone reason is linearly decidable: quantified monotone Boolean forms are linearly decidable by plugging in 1 (true) for existentially quantified variables and 0 (false) for universally quantified ones.
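
A brute-force Python sketch of both claims on tiny examples (my own check, not the program mentioned above): it counts models versus true quantifier prefixes for a small formula, and decides a quantified monotone formula by substituting True for existential variables and False for universal ones.

    from itertools import product

    n = 3
    f = lambda x, y, z: (x and y) or (not x and z)   # small example formula

    # Evaluate a fully quantified sentence: prefix gives 'E' or 'A' per
    # variable, outermost first.
    def holds(prefix, formula, assignment=()):
        if len(assignment) == n:
            return formula(*assignment)
        branches = (holds(prefix, formula, assignment + (v,)) for v in (False, True))
        return any(branches) if prefix[len(assignment)] == 'E' else all(branches)

    # Theorem 1 check: #models vs. number of quantifier prefixes that hold.
    models = sum(f(*bits) for bits in product((False, True), repeat=n))
    quants = sum(holds(p, f) for p in product('EA', repeat=n))
    print(models, quants)  # equal counts (4 and 4 for this formula)

    # Theorem 2 sketch: for a MONOTONE formula, one evaluation decides the
    # sentence: substitute True for 'E' variables and False for 'A' ones.
    g = lambda x, y, z: (x and y) or z               # monotone example
    prefix = ('A', 'E', 'A')                          # Forall x Exists y Forall z
    fast = g(*[q == 'E' for q in prefix])
    slow = holds(prefix, g)
    print(fast, slow)  # False False: the shortcut agrees with full evaluation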

Commonsense is monotone. RIP jmc by jdp.

Wisdom is monotone.

Is cognition also linear? Is there a reddit area more appropriate for these two theorems?

avoid negation and prosper in trees of truth, Joseph Daniel Pehoushek

3 Comments
2024/03/19
09:00 UTC

0

Why don't bank numbers contain some sort of checksum to check if they were copied correctly?

Like in barcodes, QR codes, wired protocols, etc.
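
For comparison, here is a minimal Python sketch of the kind of check digit used elsewhere, the Luhn check on payment card numbers (an assumption on my part that this is the sort of checksum the question has in mind):

    # Luhn check, as used on payment card numbers: walking from the right,
    # double every second digit (subtracting 9 if the result exceeds 9),
    # sum everything, and require the total to be divisible by 10.
    def luhn_valid(number: str) -> bool:
        total = 0
        for i, ch in enumerate(reversed(number)):
            d = int(ch)
            if i % 2 == 1:          # every second digit from the right
                d *= 2
                if d > 9:
                    d -= 9
            total += d
        return total % 10 == 0

    print(luhn_valid("79927398713"))  # True: the classic Luhn test number
    print(luhn_valid("79927398710"))  # False: last digit altered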

14 Comments
2024/03/18
17:44 UTC

4

Data compression using Perlin noise or predetermined maps.

Is there some kind of method or way to compress strings or arrays, like trying to match an array of numbers against each seed of Perlin noise?

And if it matches, it returns the seed, the index, and the length, which sounds kinda OP.
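
A rough Python sketch of the brute-force idea, using Python's random module as a stand-in for Perlin noise and small byte arrays (my own illustration, with made-up parameters): try each seed, generate a stream, and search for the target as a contiguous run, returning (seed, offset, length) on a hit.

    import random

    # "Compression" by seed search: for each seed, generate a pseudo-random
    # byte stream (stand-in for Perlin noise) and look for the target array
    # inside it. If found, (seed, offset, length) describes the data. For
    # longer arbitrary data a match is astronomically unlikely, which is why
    # this doesn't work as general-purpose compression.
    def find_seed(target, max_seed=10_000, stream_len=4096):
        target = bytes(target)
        for seed in range(max_seed):
            rng = random.Random(seed)
            stream = bytes(rng.randrange(256) for _ in range(stream_len))
            offset = stream.find(target)
            if offset != -1:
                return seed, offset, len(target)
        return None

    def regenerate(seed, offset, length):
        rng = random.Random(seed)
        stream = bytes(rng.randrange(256) for _ in range(offset + length))
        return list(stream[offset:offset + length])

    match = find_seed([7, 42])   # very short targets can usually be found
    if match:
        print(match, regenerate(*match))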

2 Comments
2024/03/18
16:35 UTC

0

Essay on robotics in the field of computing

Author: Jairo Antonio Mejía

Robotics in the field of computing has evolved significantly, transforming the way we interact with technology and how it affects us as a society. However, this evolution raises deep questions about the human and the technological, challenging our traditional conceptions of identity and ethics.

First, the integration of robotics into computing forces us to reflect on what it means to be human in a world increasingly dominated by technology. Robots and autonomous systems can perform tasks that were once exclusive to humans, such as making complex decisions or engaging in social interaction. This raises the question of whether our humanity resides in our cognitive and physical capabilities, or whether there are subtler aspects of the human experience that robots cannot replicate.

In addition, robotics poses urgent ethical challenges that must be addressed. What responsibility do we have as creators of technology to ensure that robots act ethically and respectfully? How can we protect people's privacy and security in a world where robots have access to vast amounts of data? These questions force us to consider how we can ensure that technology benefits humanity rather than harms it.

On the other hand, robotics also offers exciting opportunities to improve our lives. Robots can help with dangerous or monotonous tasks, freeing humans to do more meaningful and creative work. Interaction with robots can also provide companionship and emotional support, especially for the elderly or people with disabilities.

Ultimately, the integration of robotics into computing challenges us to rethink our relationship with technology and with one another. We must embrace the potential of robotics to improve our lives while remaining alert to the ethical and existential challenges it poses. In doing so, we can forge a future in which technology and humanity coexist in a harmonious and enriching way.

4 Comments
2024/03/18
16:08 UTC
