Sunday, July 29, 2007

Erlang and Neural Networks article on trapexit

Thanks to the reader who submitted my article to trapexit; the guy that runs it asked me to post the full article there. It ended up being around 7th for a day or two. So far, I haven't had any complaints about it, so either people haven't read it, or it's been pretty accurate. :)

I didn't think writing it would be such a big job when I started, but I took the position that anyone reading it had minimal mathematical and Erlang background. Therefore, there was a lot to explain. I can appreciate now why good textbooks are hard to come by. :P

But I did learn a lot about Erlang and functional-style programming. In addition, neural networks aren't really mystifyingly magical, as I think many people imagine them to be. I used to think you could just solve anything with them. And while they solve a particular class of problems quite well, they're essentially just high-dimensional gradient descent.
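To make the "gradient descent" point concrete, here's a minimal sketch of one descent step in Erlang. This is my own illustration, not code from the article: the module name, the learning rate `Eta`, and the gradient function are all placeholders I picked for the example.

```erlang
-module(gd).
-export([step/3]).

%% One gradient descent update, applied component-wise to a weight
%% vector: W' = W - Eta * dE/dW. GradFun returns the gradient of the
%% error with respect to each weight.
step(Weights, Eta, GradFun) ->
    Grads = GradFun(Weights),
    lists:zipwith(fun(W, G) -> W - Eta * G end, Weights, Grads).
```

For example, minimizing E = sum of squared weights (gradient 2W), `gd:step([1.0, -2.0], 0.1, fun(Ws) -> [2 * W || W <- Ws] end)` moves each weight a small step toward zero.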

I'll probably work on the neural network code later on, since I haven't yet written a trainer for it. And I'll probably try a particle swarm optimization article in Erlang some other time. In the meantime, I have other things to experiment with and work on. You'll hear about it here first!


  1. Anonymous, 1:45 AM

    I can say that people have read your article on trapexit, as the average time (based on Google Analytics) on the page was 7 minutes. Karl

  2. Anonymous, 3:10 AM


    the library function for "vector map" is lists:zipwith/3


  3. Whoo. 7 mins isn't too bad. I feel like my writing style rambles too much. I've been working on cutting my posts down to be more succinct.

    Francesco sent me this link to a post talking about dot_prod.

    I agree that the way I wrote the dot product wasn't the best. Thanks for the tip on zipwith. I think that post discusses zipwith too, and didn't think it was as good an idea as it might seem.
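    For what it's worth, here's a minimal sketch of a dot product written the way the commenter suggests, using lists:zipwith/3 to multiply the vectors pairwise and lists:sum/1 to add up the products. The module name is my own choice for the example.

    ```erlang
    -module(dot).
    -export([dot_prod/2]).

    %% Dot product of two equal-length lists: multiply elements
    %% pairwise with zipwith/3, then sum the products.
    dot_prod(Xs, Ys) ->
        lists:sum(lists:zipwith(fun(X, Y) -> X * Y end, Xs, Ys)).
    ```

    Note that lists:zipwith/3 exits with badarg if the two lists differ in length, which is arguably the right behavior for a dot product anyway.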