you mean let
.
and then letting Hindley-Milner do the rest
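A minimal sketch of what that quip points at, in Rust (whose local type inference is HM-influenced, though not full Hindley-Milner): you write `let` with no annotation, and unification against later uses pins the type down. The variable names here are just for illustration.

```rust
fn main() {
    // No annotation on `v`: the later push of an i32 lets
    // unification infer Vec<i32> for it.
    let mut v = Vec::new();
    v.push(1);

    // `id` is generic; each call site instantiates the
    // inferred polymorphic type at a concrete type.
    fn id<T>(x: T) -> T {
        x
    }
    let n = id(42);
    let s = id("hello");

    println!("{} {} {:?}", n, s, v);
}
```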
Man I just built a new rig last November and went with nvidia specifically to run some niche scientific computing software that only targets CUDA. It took a bit of effort to get it to play nice, but it at least runs pretty well. Unfortunately, now I’m trying to update to KDE6 and play games and boy howdy are there graphics glitches. I really wish HPC academics would ditch CUDA for GPU acceleration, and maybe ifort + mkl while they’re at it.
<esc> <esc> <esc> <esc> <esc> <esc> :wa! <cr>
This post made me go try something in Clojure again, and man, I forgot just how fucking good the language is. Everything fits together so nicely.