/r/ProgrammingLanguages
This subreddit is dedicated to the theory, design and implementation of programming languages.
Be nice to each other. Flame wars and rants are not welcome. Please also put some effort into your post; this isn't Quora.
This subreddit is not the right place to ask questions such as "What language should I use for X?", "What language should I learn?", or "What's your favourite language?". Such questions should be posted in /r/AskProgramming or /r/LearnProgramming.
https://ryanbrewer.dev/posts/implicit-products-better-forall.html
A blog post on a really cool idea in type theory: what it means and how it's used.
EDIT: Fixed the link. Sorry about that!
Let's imagine a general-purpose language that provides a summation function for lists of floats in its standard library. Rather than implementing it naively, the library could use one of the approaches that reduce floating-point error accumulation in the sum.
However, this will also mean that
    floats.reduce((sum, num) => sum + num) !== standardLibrarySum(floats)
Are there any cases where this inequality would cause inconvenience, or is precise summation more or less a no-brainer for the standard library (assuming a summation function is provided at all)?
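For concreteness, here is a sketch of one such approach, Kahan compensated summation (an illustrative example, not from the post):

    def kahan_sum(xs):
        # Compensated summation: track the low-order bits that each
        # plain addition loses and feed them back into the next step.
        total = 0.0
        c = 0.0  # running compensation
        for x in xs:
            y = x - c
            t = total + y
            c = (t - total) - y
            total = t
        return total

    floats = [0.1] * 10
    naive = sum(floats)                # plain left-to-right reduction
    print(naive)                       # 0.9999999999999999
    print(kahan_sum(floats))           # 1.0
    print(naive == kahan_sum(floats))  # False: exactly the inequality above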
I'm working on my programming language; here is a simple prototype. I would love to have simple syntax highlighting in VS Code (my main IDE). Looking for ideas, I found that VS Code uses TextMate grammars for tokenization. Is it a good idea to use a TextMate grammar to tokenize my language? I plan to write the parser and bytecode generation in Python, and a simple VM in C. This could also be a problem, because vscode-textmate is written in JS. Is it a problem that tokenization and parsing are written in different languages? What's your experience with that?
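For what it's worth, a TextMate grammar lives entirely on the editor side (VS Code uses it only for highlighting), so it never needs to share code with a Python/C toolchain; the highlighter and the compiler being in different languages is normal. A minimal .tmLanguage.json sketch, where "mylang" and the example patterns are placeholder assumptions:

    {
      "scopeName": "source.mylang",
      "patterns": [
        { "name": "comment.line.double-dash.mylang", "match": "--.*$" },
        { "name": "keyword.control.mylang", "match": "\\b(if|else|while|return)\\b" },
        { "name": "constant.numeric.mylang", "match": "\\b[0-9]+\\b" }
      ]
    }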
Are there mainstream languages that would let me encode properties such as list monotonicity or something like [x%2==0 for x in some_list] in the type checker itself?
Am I looking for dependent types?
Languages must be Turing complete on their own. So... no HTML or CSS exclusively.
Other than that go wild!
I am going back and reworking the syntax of my language a little bit. I ultimately decided on eliminating semicolons from my language grammar, and using significant indentation. Strangely, despite being by far the more controversial decision, significant indentation was far easier to implement.
However, I am having quite a bit of trouble determining how to parse sequences of expressions. My language is expression based, but allows for sequences of expressions-as-statements (like in F#). My language also has function application without requiring parenthesis.
I can't simply terminate statements on a new line, because I want to allow for things like lambdas/lets to span multiple lines, and I want to allow infix operators to continue lines. I also can't use the Swift approach of injecting semicolons where production stalls, because for my language production never stalls: it just ends up as a function application! F#'s approach seems like it would work but it is quite complex, keeping a stack of contexts to determine when to inject tokens.
I have seen this article about semicolon inference but it doesn't seem like any of them (except maybe Scala's?) fit in this context. Does anyone have advice for what to do here? Maybe there is some other approach that can take advantage of the fact I have already committed to significant whitespace? Or should I just take the stack-of-contexts approach F# does?
TL;DR: My language both lacks a statement terminator and allows application by juxtaposition. How can my parser tell where an expression-as-a-statement ends in a sequence of "statements"?
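One option that leans on the whitespace you've already committed to (a sketch of one possible rule, not F#'s algorithm): a newline terminates the current statement unless the next non-blank line is indented deeper than the line that opened the statement. A toy layout pass in Python; it ignores the infix-operator continuation case, which you would handle by also peeking at the next line's first token:

    def inject_terminators(lines):
        """Toy layout pass: yield source lines plus synthetic ';' markers.

        Rule: a statement ends at a newline unless the next non-blank line
        is indented deeper than the line that opened the statement
        (deeper indent = continuation, e.g. a multi-line lambda or an
        argument to a juxtaposition-style application).
        """
        def indent(s):
            return len(s) - len(s.lstrip(' '))

        stmt_indent = None
        for i, line in enumerate(lines):
            if not line.strip():
                continue
            if stmt_indent is None:
                stmt_indent = indent(line)
            yield line
            nxt = next((ln for ln in lines[i + 1:] if ln.strip()), None)
            if nxt is None or indent(nxt) <= stmt_indent:
                yield ';'  # synthetic statement terminator
                stmt_indent = None

    src = [
        'let f = fun x ->',
        '    x + 1',       # deeper indent: still part of the let
        'print (f 2)',     # back to the opening column: new statement
    ]
    print(list(inject_terminators(src)))
    # ['let f = fun x ->', '    x + 1', ';', 'print (f 2)', ';']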
Traditionally, the Tokenizer takes source code as input and repeatedly max-munches a token from it, as defined by a set of regular expressions.
This means that a traditional Tokenizer cannot handle nested comments such as `/* /* */ */`.
How do you approach this problem in your programming language?
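The usual fix is to step outside the regular-language fragment for just this one token: hand-write a small scanning loop (or use your lexer generator's start conditions) that tracks nesting depth, since depth counting is exactly what a pure regular expression cannot do. A minimal Python sketch:

    def skip_nested_comment(src: str, pos: int) -> int:
        """Skip a nested /* ... */ comment starting at src[pos] and
        return the index just past the matching closing */."""
        assert src.startswith('/*', pos)
        depth, i = 0, pos
        while i < len(src):
            if src.startswith('/*', i):
                depth += 1
                i += 2
            elif src.startswith('*/', i):
                depth -= 1
                i += 2
                if depth == 0:
                    return i
            else:
                i += 1
        raise SyntaxError('unterminated comment')

    print(skip_nested_comment('/* /* */ */ rest', 0))  # 11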
I'm asking this specifically in the context of an imperative programming language that's meant to be easy to learn. Although it would be interesting also to hear opinions in a more general sense.
I feel like having complete type inference could help decrease the learning curve and make it easier to learn the language gradually. On the other hand, mandatory top-level type annotations could make the compiler simpler and faster, and make it easier to produce good error messages. Another advantage (or disadvantage) is that it would make it trivial to add overloading to the language, and it wouldn't be hard to reason about how the compiler resolves calls.
What do you think?
edit: removed word out of place, fixed grammar.
I'm trying to define OOP concepts using only functions and variables in a dynamically typed language.
I have successfully defined classes and objects, but I can't seem to figure out a way to define inheritance.
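One closure-based sketch (hypothetical names, not the poster's code): represent an object as a table of methods that close over the constructor's local state, and implement inheritance by copying the parent's table and overriding or adding entries:

    def make_animal(name):
        # "object" = a dict of methods closing over local state
        def speak():
            return f'{name} makes a sound'
        return {'speak': speak, 'name': lambda: name}

    def make_dog(name):
        methods = dict(make_animal(name))  # "inherit": start from the parent's table
        def speak():                       # override the parent's method
            return f'{methods["name"]()} barks'
        methods['speak'] = speak
        return methods

    d = make_dog('Rex')
    print(d['speak']())  # Rex barks
    print(d['name']())   # Rex (inherited, not overridden)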
Lambda calculus deals with function application, while the typed versions deal with the admissible types of applied values. The typed versions are neatly classified by the lambda cube.
But instead of the usual lambda-cube typing, I'm trying to put classical logic on top of untyped lambda calculus to define typing. That way, besides implication, we would automatically get access to and, or, and not operators (which have some analogies with the usual set operators). Although these additional logical operators are definable from implication and falsity, I wonder how simple a language that supports them automatically would look.
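For a concrete reference point: even in the untyped calculus, conjunction and disjunction already have standard term-level inhabitants via the Church encodings (pairs for products, tagged injections for sums). Written as Python lambdas:

    # Church encodings of the connectives, as Python lambdas.
    pair = lambda a: lambda b: lambda f: f(a)(b)   # conjunction as product
    fst  = lambda p: p(lambda a: lambda b: a)
    snd  = lambda p: p(lambda a: lambda b: b)

    inl  = lambda a: lambda l: lambda r: l(a)      # disjunction as tagged sum
    inr  = lambda b: lambda l: lambda r: r(b)
    case = lambda s: lambda l: lambda r: s(l)(r)

    p = pair(1)('x')
    print(fst(p), snd(p))                        # 1 x
    print(case(inl(41))(lambda a: a + 1)(str))   # 42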
Did anyone encounter something similar around the web, or dealt with it in their research?
[Edit]
I was expecting something simple, to present it as an example in my duality-based formal system creator, because I had already prepared separate simple examples for untyped LC and classical logic. But it seems there is more to combining them than I thought at first. There is a whole complication around negation, and I would very much like to avoid it.
I decided to settle on positive logic, and to deal with conjunction and disjunction the way sequent calculus does (my system is already based on sequents in the role of deductive rules), turning them into implications via intro and elim rules. Implications would then be treated in the usual type-theoretic way, synchronized with applying arguments to functions. Of course, variables would be just an add-on to be able to reach the whole lambda-cube thing. But more about that later...
I would like to thank you all for getting me to this solution.
Has anyone been posting solutions to AoC 2023 in your own language? I got up to day 5 last year, then I didn't have enough time for it anymore. Currently I am on day 2. I think it's super useful, since it tests the language and its ideas against problems you wouldn't think of yourself; I always find new bugs, problems, and challenges for the language.
Building on this discussion
and my conclusion to avoid LLVM, I'm wondering how much effort it would take to develop a frontend for GCC. Can anyone say something about this?
For example, a frontend is currently being developed for Rust.
Hello,
I have been wondering for a while if there exists a programming language where types subsume patterns, i.e. where patterns do not exist and are instead replaced by high precision types.
Intuitively, it looks like it should be possible to me, using a combination of features like subtyping, singleton types, union types, and negation types. However, I have not been able to find such a language in my searches, so I am asking whether you know of one, or of something that comes close.
EDIT: Here is an example of what I have in mind.
EDIT 2: After reading the replies and giving it more thought, I guess "subsumption" is too strong a word, as destructuring bindings still require patterns to be different from types. The idea is more "type-based pattern matching", where patterns describe a value only through its type, which is also used to perform exhaustivity checks.
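As a rough existing approximation (a sketch, not something from the replies): Python's match statement already branches on types via class patterns, giving patterns that describe values through their types, though without the union/negation types or exhaustivity checking asked about here:

    def describe(v: int | str | list) -> str:
        match v:
            case int():        # matches by type, like "v : Int"
                return 'an integer'
            case str():
                return 'a string'
            case [int(), *_]:  # type-shaped destructuring
                return 'a list starting with an integer'
            case _:
                return 'something else'

    print(describe(3), describe('hi'), describe([1, 2]))
    # an integer a string a list starting with an integer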
I want my stdlib Dictionary type (HashMap) to be generic. I want to be able to parametrise it not only with specific types but, more generally, with a key type constructor and a value type constructor. These two type constructors must agree on their arguments, and the values of those arguments can vary per hash-table entry.
Something like this:
    struct Dictionary<Args: /* ??? */, Key: (Args) -> Type, Value: (Args) -> Type> {
        subscript<A: Args>(key: Key<A>) -> Value<A>? {
            get {}
            set {}
        }
    }
    struct EnvironmentKey<ValueType> {}
    typealias SameType<T> = T

    var dict: Dictionary<EnvironmentKey, SameType> = [:]

    func getEnvironmentValue<T>(key: EnvironmentKey<T>) -> T? {
        return dict[key: key]
    }

    struct String {}
    struct Int {}

    var simple: Dictionary<String, Int> = [:]
But I also want it to be usable with plain types. So Args is something that abstracts over zero or more arguments to the type constructor. What is Args in this case? What kind of type system is this? Any reading recommendations?
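For what it's worth, the fixed instance of this pattern (Key = EnvironmentKey, Value = the identity constructor) is expressible with ordinary generics as a type-indexed heterogeneous map; it is abstracting over the Key and Value constructors themselves that calls for higher-kinded type parameters (in Haskell this shape shows up as GADT-keyed dependent maps). A Python sketch of just the fixed instance, with hypothetical names:

    from typing import Generic, TypeVar, cast

    T = TypeVar('T')

    class EnvironmentKey(Generic[T]):
        """A key whose phantom type parameter records its value's type."""

    class Env:
        def __init__(self) -> None:
            self._store: dict[object, object] = {}

        def __setitem__(self, key: EnvironmentKey[T], value: T) -> None:
            self._store[key] = value

        def __getitem__(self, key: EnvironmentKey[T]) -> 'T | None':
            # The key's type parameter justifies the cast back to T.
            return cast('T | None', self._store.get(key))

    PORT: EnvironmentKey[int] = EnvironmentKey()
    env = Env()
    env[PORT] = 8080
    print(env[PORT])  # 8080, statically typed as int | None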
I am looking for something of the sort just to learn. I wanted to dive as deep as I could without using external things (like LLVM IR for the backend; LLVM confuses me as a beginner anyway). My goal is to one day make something that takes the best of all the popular and underrated programming languages and mashes them together.
In most languages that support array-array comparison, it is 1-dimensional. That is:

    if [a, b, c] == [d, e, f] :
        -- do stuff

    -- this expands to
    h1 = true
    arr1 = [a, b, c]
    arr2 = [d, e, f]
    for i in range(len(arr1)) :
        h1 = h1 and (arr1[i] == arr2[i])
    if h1 :
        -- do stuff
2D Array-Array Comparison would be as follows:
    -- these variables could instead be constants
    a: int = 1
    b: int = 2
    c: int = 3
    d: int = 4
    e: int = 5
    f: int = 6

    if [a, b, c, d] |=| [e, f] :
        -- do stuff

    -- this expands to
    arr1 = [a, b, c, d]
    arr2 = [e, f]
    h1 = false
    for var1 in arr1 :
        h2 = false
        for var2 in arr2 :
            h2 = h2 or (var1 == var2)
        h1 = h1 or h2
    if h1 :
        -- do stuff

    if [a, b, c, d] |=& [e, f] :
        -- do stuff

    -- this expands to
    arr1 = [a, b, c, d]
    arr2 = [e, f]
    h1 = false
    for var1 in arr1 :
        h2 = true
        for var2 in arr2 :
            h2 = h2 and (var1 == var2)
        h1 = h1 or h2
    if h1 :
        -- do stuff

    if [a, b, c, d] &=| [e, f] :
        -- do stuff

    -- this expands to
    arr1 = [a, b, c, d]
    arr2 = [e, f]
    h1 = true
    for var1 in arr1 :
        h2 = false
        for var2 in arr2 :
            h2 = h2 or (var1 == var2)
        h1 = h1 and h2
    if h1 :
        -- do stuff

    if [a, b, c, d] &=& [e, f] :
        -- do stuff

    -- this expands to
    arr1 = [a, b, c, d]
    arr2 = [e, f]
    h1 = true
    for var1 in arr1 :
        h2 = true
        for var2 in arr2 :
            h2 = h2 and (var1 == var2)
        h1 = h1 and h2
    if h1 :
        -- do stuff
The compiler would first fold the arrays of variables into arrays of constants as an optimization. Are there languages that have this feature? If not, what use cases would warrant it?
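For reference (a sketch): the four operators are exactly the four quantifier combinations, which a language with any/all-style folds can already express directly:

    arr1, arr2 = [1, 2, 3, 4], [5, 6]

    some_eq_some = any(x == y for x in arr1 for y in arr2)       # |=|
    some_eq_all  = any(all(x == y for y in arr2) for x in arr1)  # |=&
    all_eq_some  = all(any(x == y for y in arr2) for x in arr1)  # &=|
    all_eq_all   = all(all(x == y for y in arr2) for x in arr1)  # &=&
    print(some_eq_some, some_eq_all, all_eq_some, all_eq_all)    # False False False False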
Let's say I'm in Python (because I am), and I'm type hinting (because I like types), making a simple Grid class.
    Coord = tuple[int, int]

    class Grid:
        ...
        def __getitem__(self, key: Coord):
            return self.grid[key[1]][key[0]]
This makes sense; I can say grid[1, 2] to get my value rather than that nasty grid[2][1]. But then I had the bright idea of using a namedtuple for my Coord:

    from collections import namedtuple

    Coord = namedtuple('Coord', ['x', 'y'])

Of course, then I realized that if I ever want to use key.x and key.y (or pass a type checker) I have to actually say grid[Coord(1, 2)], which is obviously a nonstarter.

Is there a language feature in any language where I can say that f(c) takes in a namedtuple or equivalent but still be able to pass in a primitive tuple, without having the first line of my function be c = Coord(c)? This feels familiar to me, kind of like structural pattern matching, but I can't actually think of a language that works like this.
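For reference, a Python sketch of the union-plus-normalization workaround (a sketch only; it still pays the one conversion line the post wants to avoid, but nothing more): since a NamedTuple is a real tuple, Coord(*key) accepts both spellings, and the union annotation lets grid[1, 2] pass the type checker:

    from typing import NamedTuple

    class Coord(NamedTuple):
        x: int
        y: int

    class Grid:
        def __init__(self, rows: list[list[int]]) -> None:
            self.grid = rows

        def __getitem__(self, key: Coord | tuple[int, int]) -> int:
            c = Coord(*key)  # near no-op: a Coord is already a tuple
            return self.grid[c.y][c.x]

    g = Grid([[1, 2], [3, 4]])
    print(g[1, 0], g[Coord(x=1, y=0)])  # 2 2; both spellings type-check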