/r/computerscience
The hot spot for CS on reddit.
Welcome to /r/ComputerScience!
We're glad you're here.
This subreddit is dedicated to discussion of Computer Science topics including algorithms, computation, theory of languages, theory of programming, some software engineering, AI, cryptography, information theory, and computer architecture.
For more detailed descriptions of these rules, please visit the rules page
Sorry if this has been asked before, but I am new to computer science and will be working from home. I was wondering if there was anything that people recommended having in their home office. Whether its decor, cheatsheets, books, etc. Basically, if you have felt that your life as a data analyst has been made easier by a specific thing, I wanna hear about it.
New student here. The topic seems to have simple principles but messy, overlapping concepts that are precise and demand a lot of contemplation and comprehension if one wishes to grasp it entirely. Any tips on how to master it?
Link Article: https://www.mdpi.com/2076-3417/14/19/8983
If you would like to share the article it is greatly appreciated.
Thank you
info in comments
Just wondering since I find the concepts very interesting, but I'm faced with many differing opinions. It's literally an r/AskReddit type question but CS lol
An example of another alternate computing paradigm is neuromimetic computing
Hello Folks,
I am in Uni and learning CS. I finished learning and understanding Computer Architecture and Organization (CAO) and now heading to study Operating System.
In learning CAO, there was no usage of any HLL like C, Python etc. Before diving in OS, I have some queries:
Do I need to use an HLL while learning OS? Or should I just focus on the concepts and then implement them in whatever language is needed?
I keep seeing things like fork() and exec(). I don't know what these are yet, but are they OS concepts that will stay the same regardless of the language (HLL) used? Or are they specific to one language, so I should learn them in the HLL I'm going to use?
Do different HLLs come with different libraries for working with OS concepts? I mean a library and set of functions per language, e.g. C, C++, Python.
I hope you're getting the gist of my concerns.
Thanks!
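On the second question: fork() and exec() are OS concepts (POSIX system calls), not features of any one language; every HLL just wraps them in its own library (C exposes them via `<unistd.h>`, Python via the `os` module, and so on). A minimal sketch in Python, assuming a POSIX system:

```python
import os

def spawn(argv):
    """Minimal fork+exec sketch: run a program in a child process,
    return its exit code. The underlying OS calls are the same ones
    a C program would make through <unistd.h>."""
    pid = os.fork()                   # OS call: clone the current process
    if pid == 0:
        # child: replace this process image with a new program
        os.execvp(argv[0], argv)
    # parent: wait for the child to finish
    _, status = os.waitpid(pid, 0)
    return os.waitstatus_to_exitcode(status)

if __name__ == "__main__":
    print(spawn(["echo", "hello from a child process"]))
```

The concept (fork, exec, wait) stays the same across languages; only the wrapper library differs.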
My job is willing to provide books for developers, and I want to pick up a good coffee table book which highlights some computing concepts that highlight logic gates, what a processor is actually doing, or other concepts that a CS individual might find intriguing. I'm not looking for something that will take my life to the next level or anything, just something light-hearted - it could even be targeted to children.
Hello there,
I had an idea that I think could be interesting to discuss, and I’d love to get some feedback from the community.
Here’s the idea:
Imagine a compiler (would that be a compiler? Dunno, so let's say a tool) that identifies a constant-time base operation of a program. In this scenario, we'd have access to all source code, and the tool could define this time constant as a "tick" (similar to the concept of ticks in Minecraft). The tool would then use this tick to measure performance and profile the application.
This approach could make performance comparisons much easier and more meaningful, especially across different architectures. Instead of dealing with variable timing metrics that don’t always translate well between platforms, we’d have a consistent tick-based measure for more direct comparisons.
Some questions to throw out there:
Looking forward to hearing everyone’s thoughts on this!
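To make the idea concrete, here is a rough Python sketch of what tick-based profiling could look like if the "tick" were a designated base operation (here, a comparison). All names are made up for illustration; a real tool would insert the tick calls automatically:

```python
class TickProfiler:
    """Counts abstract 'ticks' instead of wall-clock time, so the
    measurement is the same on every architecture by construction."""
    def __init__(self):
        self.ticks = 0

    def tick(self, n=1):
        # a compiler/tool would insert these calls at each base operation
        self.ticks += n

def bubble_sort(xs, prof):
    xs = list(xs)
    for i in range(len(xs)):
        for j in range(len(xs) - 1 - i):
            prof.tick()                  # one comparison = one tick
            if xs[j] > xs[j + 1]:
                xs[j], xs[j + 1] = xs[j + 1], xs[j]
    return xs

prof = TickProfiler()
bubble_sort([5, 3, 1, 4, 2], prof)
print(prof.ticks)  # same count on any machine, unlike wall-clock time
```

The trade-off is that a tick count measures algorithmic work, not real latency: two architectures can execute the same number of ticks at very different speeds.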
I do it as part of my job and now I realise that coding really sucks. Just wondered if I was the only CS graduate who found that they actually dislike coding.
I am looking for a way to separate the sound coming from a specific location from other sounds. For example, and I know this is a very bold idea to think about, I want to extract the sound of, let's say, my friend playing in an orchestra. Basically, I'm looking for a way to map out where almost every sound in a recording came from. I can't really find any papers that have analyzed this problem yet, and quite frankly, this problem piqued my interest recently and I really want to know how some of you might approach it. Any idea or lead is much appreciated.
In academic and theoretical computer science research (areas like algorithmic complexity), is a background in pure and discrete mathematics valued and useful? Or is an applied, tool-based background generally preferred? If the answer depends, what factors does it depend on?
I would appreciate your insights.
Hey everybody, I'm currently taking Algorithms and Data Structures in my second year, but so far I haven't really had much time to actually study. Now that I'm past my Calc 2 midterm, I'm looking for the best places to learn about this subject.
Mostly looking for video explanations, maybe youtubers or courses about the topic but if you have a book recommendation or anything else, I would be grateful for that too!
Thank you for reading it!
Hi people, I'm learning pipelining, and after studying synchronous design, my coursework started teaching instruction pipelining. I have a query: is instruction pipelining synchronous, or can it be asynchronous?
I’m mostly looking for overviews before I jump in studying to know what I’m learning
Hi Computer Science people.
Do you know if there are existing hardware/software implementations that leverage the ordinal positions of the true bits in a byte for sorting optimization?
Since values represented in binary already encode an ordering.
For example, I'm thinking something like this:
Step 1. Read in data.
Step 2. Define the ordinal.
Step 3. Establish which frame to build in.
Step 4. Use ordinal values to build the tree.
Step 5. When ordinal value = byte size, land the binary value into a bucket using the previous (n-1) frame info and ordinal.
Step 6. Store the current (n-1) frame info and ordinal.
Step 7. Destroy the tree frame.
Step 8. Repeat till the data feed stops.
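For what it's worth, one way to read the steps above is as an MSB-first binary radix sort: each bit position (the "ordinal") picks a bucket, and a level's frames can be discarded once processed. A rough Python sketch under that interpretation (this is my reading, not necessarily what you had in mind):

```python
def msb_radix_sort(values, bits=8):
    """Sort non-negative integers by examining bit positions
    from most-significant to least-significant (MSB-first radix sort)."""
    def sort_level(bucket, bit):
        if bit < 0 or len(bucket) <= 1:
            return bucket
        zeros, ones = [], []                 # the two child "frames" at this level
        for v in bucket:                     # the ordinal bit picks the bucket
            (ones if (v >> bit) & 1 else zeros).append(v)
        # recurse into each frame; the parent frame can then be discarded
        # ("destroy tree frame")
        return sort_level(zeros, bit - 1) + sort_level(ones, bit - 1)
    return sort_level(list(values), bits - 1)

print(msb_radix_sort([170, 3, 255, 0, 129]))  # [0, 3, 129, 170, 255]
```

This family of techniques does exist in practice: radix sorts and binary tries both exploit exactly the fact that binary representations already encode an ordering.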
Hey guys, I already asked this question in r/MLQuestions but I figured I'd try fellow compsci colleagues here as well. Hope I'm not breaking rule number 9, but I think it's interesting enough to be asked here too.
I'm working on a classifier for detecting the topic or category of a web page based on analysis of its source and possibly its URL. Practically, it means I have an annotated dataset containing the URL, the scraped source code, and the category of the page. I'm probably going with XGBoost, Random Forest, and similar methods, then comparing the results to evaluate accuracy.
Which features do you think I could extract and would actually be good for classification of the website into a predefined category?
Some good recommendations I got were using bag of words or more involved methods like TF-IDF or even BERT, but perhaps you guys would have more ideas about what could work. I thought utilizing the tags, styles, or scripts on the site could be interesting, but I can't really figure out how exactly; perhaps someone here has an idea.
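To make the tag/bag-of-words idea concrete, here's a minimal stdlib-only sketch (not a tuned pipeline; in practice you'd feed counts like these into TF-IDF weighting or directly into XGBoost as features):

```python
from collections import Counter
from html.parser import HTMLParser

class FeatureExtractor(HTMLParser):
    """Collects structural features (tag counts) plus a simple
    bag-of-words from the visible text, skipping <script>/<style>."""
    def __init__(self):
        super().__init__()
        self.tag_counts = Counter()
        self.words = Counter()
        self._skip = 0                       # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        self.tag_counts[tag] += 1
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.words.update(w.lower() for w in data.split() if w.isalpha())

def extract_features(html):
    fx = FeatureExtractor()
    fx.feed(html)
    return {"tags": dict(fx.tag_counts), "bow": dict(fx.words)}

feats = extract_features("<html><body><h1>Cheap Flights</h1>"
                         "<script>var x=1;</script>"
                         "<p>Book flights now</p></body></html>")
print(feats["tags"]["p"], feats["bow"]["flights"])
```

Tag count vectors can carry surprising signal on their own (e.g. e-shops are table/form/image-heavy, blogs are paragraph-heavy), and they're cheap to compute alongside the text features.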
Thanks a lot, and have a nice start to the week.
And if so, what kinds of programs could it implement?
In neural networks, a neuron typically holds a value between 0 and 1, especially in layers where activation functions like the sigmoid are used. This range allows the network to model probabilities or degrees of confidence in binary classifications.
A value between 0 and 1 is a float. A percentage/ratio/probability could instead be stored as an integer in the range 0 to 100. Although both usually take 4 bytes, integers are generally more space-efficient and faster to process, as they require less storage and computational overhead.
Do you think changing the representation could give any significant performance gain? I see companies like Google using nuclear power plants to power their servers for these transformers. What are your thoughts?
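As a back-of-the-envelope sketch of the representation change: this is essentially what int8 quantization in ML frameworks does, storing a [0, 1] activation in 8 bits (0..255) instead of a 32-bit float, at the cost of rounding error. (Real frameworks also learn per-tensor scales; this toy version assumes a fixed [0, 1] range.)

```python
def quantize(x, levels=256):
    """Map a float in [0, 1] to an integer code in [0, levels-1].
    256 levels fit in one byte: 4x less memory than a float32."""
    return round(x * (levels - 1))

def dequantize(q, levels=256):
    """Recover an approximate float from the integer code."""
    return q / (levels - 1)

x = 0.73
q = quantize(x)
err = abs(dequantize(q) - x)
print(q, err)  # worst-case rounding error is half a step, ~0.002
```

So the answer to "could it help" is broadly yes, and it's already widely deployed; the open question is always how much precision a given model can give up before accuracy suffers.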
MISD (Multiple Instruction streams Single Data stream)
SISD (Single Instruction stream Single Data stream)
I understand SISD: Fetch one instruction->decode->fetch data for it->execute
But when I read about MISD, it says: fetch multiple instructions -> decode them -> fetch data (the same data for all instructions) -> then, instead of executing all instructions in parallel, execute them sequentially one by one.
So isn't MISD something similar to SISD, since in the end it executes sequentially? Correct me if I am wrong.
I am confused by this naming.
I have the bases of them, but as I never went to uni I never practiced this well enough.
I never really understood it well, so I want to start from scratch. Is there a really good book with very good examples that will teach me all of computer networks? I want to understand it top to bottom.
Thanks in advance!
How would you determine (mathematically) whether a filter is linear or not? For example:
h(x, y) = 5f(x, y) − 1f(x−1, y) + 2f(x+1, y) + 8f(x, y−1) − 2f(x, y+1)
is h linear in this case?
Actually what I know is that to call a function linear, we should show it's homogeneous and additive.
So I tried to show it's homogeneous as follows: for some constants a and k, if h(ax, ay) = a^k h(x, y), then it's homogeneous. But I'm stuck at h(ax, ay) = 5f(ax, ay) − 1f(ax−1, ay) + 2f(ax+1, ay) + 8f(ax, ay−1) − 2f(ax, ay+1) and I don't actually know how to proceed.
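One note that may help: for a filter, homogeneity is about scaling the input signal f, not the coordinates, i.e. H[a·f] = a·H[f], and additivity is H[f+g] = H[f] + H[g]. Since h is a fixed weighted sum of samples of f with no constant term, both hold, so this filter is linear. A quick numeric sanity check with arbitrary test signals (not a proof, just a way to catch a non-linear filter):

```python
def H(f):
    """The filter as an operator acting on a signal f(x, y)."""
    return lambda x, y: (5*f(x, y) - 1*f(x-1, y) + 2*f(x+1, y)
                         + 8*f(x, y-1) - 2*f(x, y+1))

# arbitrary test signals and scalar (any choice should work if H is linear)
f = lambda x, y: x*x + 3*y
g = lambda x, y: 2*x - y*y
a = 4.5
pts = [(0, 0), (2, -1), (5, 7)]

# additivity: H[f+g] == H[f] + H[g]
additive = all(abs(H(lambda x, y: f(x, y) + g(x, y))(x, y)
                   - (H(f)(x, y) + H(g)(x, y))) < 1e-9 for x, y in pts)
# homogeneity: H[a*f] == a * H[f]
homogeneous = all(abs(H(lambda x, y: a * f(x, y))(x, y)
                      - a * H(f)(x, y)) < 1e-9 for x, y in pts)
print(additive, homogeneous)  # True True
```

The formal proof follows the same structure: substitute af + bg for f in the definition of h and distribute the coefficients term by term.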
Is main() a global or local function?
Suggest me resources for studying CS Core Topics and C++ in-depth!
Hi! So my interviews are coming up quite soon and I really want to revise the CS core topics inside and out. Kindly suggest resources for studying the topics (OS, DBMS, CN, OOPS mainly), as well as C++ in depth (I know C++ well enough syntactically to practise DSA and CP).
I’m a first-year bachelor’s student in CS, and it is really tough for me right now; I feel like I’m lost in discrete math. Does anyone know a way to get better at that course?
As it is now, I have no idea how to program, and I do not understand the Java programming language well enough to do anything on my own beyond trivial objects with print statements and if statements.
I had trouble coming to this conclusion before, because I had already made an effort to learn to program through the typical "intro to Java" courses and tutorials such as "learning Godot engine", even though it felt as though I was just copying code with no explanation.
I think I am relatively OK at reading language-independent descriptions of algorithms and their exercises, through videos and on paper. But when I ask certain questions about an algorithm, the answer is eventually that it will make sense once I actually code, which is when things go south.
Hey guys
I searched a lot but couldn't find an answer to what the difference is between the CSAPP 3rd edition and the 3rd global edition.
Thanks in advance!
Hello all,
Okay, this will sound like an incredibly dumb question. In my almost 2 decades (context) of software engineering, there is one thing I have long struggled with: which direction to make an arrow point in my notes, impromptu explanatory diagrams, and formal documentation.
There are cases where this is obvious: diagrams that show the flow of information, including classic flow charts (does anyone use these though?) or network diagrams where directionality has a clearly defined meaning.
However, if you say "A abstracts B" you might just as well say "B refines A". Same as "A depends on B" or "B is referenced by A".
Or even more abstractly, when you are relating concepts, each of those relations may be different within a single diagram! This more happens in personal notes and mind mapping.
I'm wondering if there's a general, perhaps obnoxiously/delightfully abstract, answer to this dilemma.
Thank you!
Bestieboots.
If making an algorithm to beat humans at 4X games (like Civ 6) were as big a deal as making a chess engine to beat humans was in the 1900s, could it be done? The disagreement is this: one side says an algorithm of that complexity could be built if it had the significance that the chess engines did in the '60s–'90s, despite the difference in complexity; the other says it is simply not feasible. The first side's reasoning is that such an algorithm hasn't been made yet because the problem is not significant enough to receive the resources needed to "solve" it, whereas a machine beating a grandmaster in the '90s was a big enough deal to receive those resources; the second side holds that it is just too complex a problem to compute.