by niels on 4/10/14, 10:49 AM with 199 comments
by AlexanderDhoore on 4/10/14, 7:02 PM
It makes me think of Elm [1] and (functional) reactive programming. Reactive programming is fantastic. It's kind of like how a spreadsheet program works: if a variable changes, all variables that depend on it change as well. Given "a = b + c", if c increments by 1, so does a.
It has many advantages over event-based systems like JavaScript's. Reactive programs don't need callbacks; the changing values propagate the "event" through the system.
I'd love to hear what you guys think about this direction of programming. It seems very natural to me.
Edit: I also see reactive programming as the golden way of having changing state in functional languages. Functional languages have no problem with data or state. They have a problem with change of state. The reactive paradigm solves that problem. All change is implicit and code can be exactly as functional as before.
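The spreadsheet analogy can be sketched in a few lines. This is a minimal illustrative model (not Elm, and not how a real FRP runtime is implemented): a source cell notifies its dependents on change, so "a = b + c" stays true without the user writing any callbacks.

```python
# Minimal sketch of spreadsheet-style reactive cells: when a source cell
# changes, every dependent cell recomputes automatically. The user wires
# up expressions, never callbacks.

class Cell:
    def __init__(self, value=None):
        self._value = value
        self._dependents = []  # cells that must recompute when we change

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new):
        self._value = new
        for dep in self._dependents:
            dep.recompute()

class Computed(Cell):
    def __init__(self, fn, *sources):
        super().__init__()
        self._fn = fn
        self._sources = sources
        for s in sources:
            s._dependents.append(self)
        self.recompute()

    def recompute(self):
        self._value = self._fn(*(s.value for s in self._sources))
        for dep in self._dependents:
            dep.recompute()

# a = b + c: incrementing c propagates to a with no callback code
b, c = Cell(1), Cell(2)
a = Computed(lambda x, y: x + y, b, c)
c.value = 3
print(a.value)  # 4
```

Change either input and the dependent value is already up to date when you read it, which is exactly the "change of state is implicit" point above.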
by jameshart on 4/10/14, 12:01 PM
by simias on 4/10/14, 11:30 AM
Learning Verilog was an eye opening experience for me. It reminded me of the time I switched from unstructured BASIC to C when I was a kid. At first it seems complex and weird then suddenly it clicks and it all starts making sense.
by jarrett on 4/10/14, 3:31 PM
Can a dependent type system catch all type errors at compile time? For example, suppose I write the following (in pseudo-code):
// Variable x is an integer greater than or equal to 0 and less than 256.
int x (>=0, <256)
x = 128 // This is valid.
x = x * 3 // This violates the type.
I can imagine how a compiler could catch that kind of error. But that's trivial. What happens in programs like this?
int x (>= 0, <= 10)
x = parseInt(getKeyboardInput)
Now the compiler can't know for sure whether the type has been violated, because the value of getKeyboardInput could be anything. To take a page from Haskell, you could do something like this (which is still pseudocode, not valid Haskell):
// x is a value that is either 1) an int from 0 to 10, or 2) nothing at all.
maybe (int (>= 0, <= 10)) x
// applyConstraint recognizes that parseInt may return a value violating x's constraints.
// Thus it transforms the return type of parseInt from int to maybe (int (>= 0, <= 10)).
x = applyConstraint(parseInt(getKeyboardInput))
Or perhaps applyConstraint wouldn't have to be called explicitly, but would be implicitly added by the compiler as needed. I'm not sure which is better stylistically. Either way, applyConstraint would be required any time a computation could return an invalid value. That would get tricky, because the compiler would have to track the constraints on every variable, even where those constraints aren't declared. For example:
int w (>= 0, <= 10)
int x (>= 0, <= 2)
int y
int z (>= 0, <= 20)
y = w * x
z = y
Here, the compiler would have to infer from the assignment "y = w * x" that y is always between 0 and 20. Do any languages currently take the idea this far (or farther)?
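Refinement-type systems (e.g. Liquid Haskell) do this kind of range tracking statically. The applyConstraint idea from the pseudocode above can at least be modeled at runtime; this Python sketch uses None to play the role of the "maybe" wrapper (the names are from the comment's pseudocode, not any real API):

```python
# Runtime model of the comment's applyConstraint idea. A refinement-typed
# compiler would perform this check statically; here we do it dynamically,
# with None standing in for the "nothing at all" case of maybe.

def apply_constraint(value, lo, hi):
    """Return value if lo <= value <= hi, else None."""
    if value is None:
        return None
    return value if lo <= value <= hi else None

def parse_int(keyboard_input):
    """parseInt that yields None on garbage input."""
    try:
        return int(keyboard_input)
    except ValueError:
        return None

# x : maybe (int (>= 0, <= 10))
x = apply_constraint(parse_int("7"), 0, 10)    # 7
y = apply_constraint(parse_int("128"), 0, 10)  # None: out of range
z = apply_constraint(parse_int("abc"), 0, 10)  # None: not an int at all
```

The w * x inference in the last example is interval arithmetic: from w in [0, 10] and x in [0, 2] the compiler can conclude y = w * x lies in [0, 20], so the assignment z = y needs no check.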
by lolo_ on 4/10/14, 12:56 PM
There's an interesting article [1] on how Jonny Greenwood of Radiohead uses Max/MSP [0] extensively; in it you can see some examples of how it works: modules wired together visually.
I think there is a lot of potential for a really nice mix between text-based programming and graphical programming to work for general programming too.
[0]: http://cycling74.com/products/max/ [1]: http://thekingofgear.com/post/25443456600/max-msp
by bru on 4/10/14, 11:30 AM
- parallel and concurrent are 2 different things
- the 'symbolic languages' definition seems off. Wikipedia puts it right:
> symbolic programming is computer programming in which the program can manipulate formulas and program components as data
So it's not "using graphs & such to program"
by hexagonc on 4/10/14, 5:02 PM
[1] http://en.wikipedia.org/wiki/HP-48_series [2] http://en.wikipedia.org/wiki/Conway%27s_Game_of_Life
by untothebreach on 4/10/14, 1:51 PM
1: factorcode.org
by milliams on 4/10/14, 1:41 PM
by sirsar on 4/10/14, 1:44 PM
I rarely use it because organization is such a pain, but its "data-flow" paradigm does simplify a lot of logic.
by prezjordan on 4/10/14, 3:35 PM
by saosebastiao on 4/10/14, 5:36 PM
I think that any future the Declarative paradigm has within general-purpose languages is the kind applied by compilers. For example, x = (a + b) - a can be reduced to x = b, or even eliminated altogether with subsequent in-scope references to x replaced by b. Another example is dead code elimination. These forms of declarative optimization let you use an imperative or functional language immediately but gently introduce you to declarative benefits, without having to deal with all the mind-bending that is necessary to optimize pure declarative code.
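That algebraic simplification can be sketched on a toy expression tree. Real compilers apply many such rewrite rules over an IR; this is illustrative only, showing the single rule (a + b) - a ==> b:

```python
# Toy algebraic simplifier: rewrite (a + b) - a to b on a small
# expression tree. Frozen dataclasses give us structural equality,
# so subtree comparison is just ==.

from dataclasses import dataclass

@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class Add:
    left: object
    right: object

@dataclass(frozen=True)
class Sub:
    left: object
    right: object

def simplify(expr):
    # (a + b) - a  ==>  b   (and (a + b) - b  ==>  a)
    if isinstance(expr, Sub) and isinstance(expr.left, Add):
        if expr.left.left == expr.right:
            return expr.left.right
        if expr.left.right == expr.right:
            return expr.left.left
    return expr  # no rule applies

a, b = Var("a"), Var("b")
print(simplify(Sub(Add(a, b), a)))  # Var(name='b')
```

After this rewrite, copy propagation can replace later uses of x with b, and dead code elimination can drop the assignment entirely, exactly the pipeline described above.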
by z3phyr on 4/10/14, 11:52 AM
by mjb on 4/10/14, 1:29 PM
Both Promela (Spin) and TLA+ have active communities and have found fairly wide use in industry. They are generally used for model checking, model extraction by guided abstraction, and development by refinement, but can be used in a much more ad hoc way to just experiment with parallel ideas.
by snorkel on 4/10/14, 11:59 AM
by sergiosgc on 4/10/14, 5:04 PM
Is this line of evolution in languages considered dead?
by kitd on 4/10/14, 1:57 PM
[1] https://code.google.com/p/anic/wiki/Tutorial [2] http://lampwww.epfl.ch/funnel/
by josephschmoe on 4/10/14, 6:26 PM
A true Code Search would work like this:
1. Type your search term into your code as a comment line, i.e. "Bubble sort StampArray by name".
2. Google/Bing/StackOverflow searches for your string, replacing your terms with generics: it searches for "Bubble sort [an array of objects] by [string variable]".
3. It takes the code results and shows them to you, replacing all instances of [string variable] with getName() and all instances of [Object[]] with StampArray.
4. You pick your favorite.
5. Your IDE adds the code to a "code search module" which you can edit.
6. Your edits get added to the search database.
The best part? You could even put your Declarative Programming engine -inside- of the Search just by populating initial search results. What about better code coming to exist in the future, you say? Well, you don't necessarily have to keep the same result forever. If it's been deprecated, you can re-do the search.
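The generalize/specialize step in that workflow can be sketched with plain string substitution. Everything here is hypothetical: the binding table, the placeholder syntax, and the example snippet are all made up to illustrate the idea, not a real search API:

```python
# Hypothetical sketch of the "replace your terms with generics" step:
# generalize a concrete query into a placeholder form for searching,
# then specialize a generic result back to the user's own names.

def generalize(query, bindings):
    """'Bubble sort StampArray by name' -> 'Bubble sort [array] by [field]'."""
    out = query
    for concrete, placeholder in bindings.items():
        out = out.replace(concrete, placeholder)
    return out

def specialize(snippet, bindings):
    """Substitute the user's names back into a generic code result."""
    out = snippet
    for concrete, placeholder in bindings.items():
        out = out.replace(placeholder, concrete)
    return out

bindings = {"StampArray": "[array]", "name": "[field]"}
q = generalize("Bubble sort StampArray by name", bindings)
# q == "Bubble sort [array] by [field]"
code = specialize("sort([array], key=[field])", bindings)
# code == "sort(StampArray, key=name)"
```

A production version would need real identifier parsing rather than substring replacement, but the round trip (concrete query -> generic search -> specialized result) is the core of the proposal.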
by protomyth on 4/10/14, 4:15 PM
by josephschmoe on 4/10/14, 6:02 PM
by keenerd on 4/10/14, 6:03 PM
"It feels like I am sitting at the controls of a quantum computer. I've got all my qubits (terms) all wired together in some complicated expression and when power is applied every qubit will instantly collapse out of superposition and crystallize into a perfect answer."
(From something I've been working on, http://kmkeen.com/sat/ )
by Blahah on 4/10/14, 11:31 AM
by aufreak3 on 4/11/14, 4:12 PM
Dealing with process coordination using the resolution of logical variables gave me a refreshing new perspective. The finite domain constraint system design in Oz is an awesome example of this in action.
An interesting tidbit - the Mozart/Oz team invented "pickling" before it caught on with Python.
by danielweber on 4/10/14, 2:54 PM
I might be confusing this with custom literals.
by josephschmoe on 4/10/14, 6:06 PM
by JupiterMoon on 4/10/14, 1:22 PM
by kazagistar on 4/13/14, 1:56 PM
by cowls on 4/10/14, 12:38 PM
by joshlegs on 4/10/14, 8:53 PM
http://www.reddit.com/r/programming/comments/22nhb2/six_prog...
by SeanLuke on 4/10/14, 1:30 PM
This is so wrong I don't know where to begin.