We are happy to announce that A. Bergel's book is out:
https://www.apress.com/gp/book/9781484253830
Agile Artificial Intelligence in Pharo
Implementing Neural Networks, Genetic Algorithms, and Neuroevolution
This book covers classical algorithms commonly used as artificial intelligence techniques and shows how to program agile artificial intelligence applications using Pharo. It takes a practical approach, presenting implementation details to illustrate the numerous concepts it explains.
As a book author myself, I want to congratulate Alex for this effort.
S.
Stéphane Ducasse
http://stephane.ducasse.free.fr / http://www.pharo.org
03 59 35 87 52
Assistant: Aurore Dalle
FAX 03 59 57 78 50
TEL 03 59 35 86 16
S. Ducasse - Inria
40, avenue Halley,
Parc Scientifique de la Haute Borne, Bât.A, Park Plaza
Villeneuve d'Ascq 59650
France
Thanks, Stef, the book is excellent!
Best wishes,
Tomaz
Write it on Amazon :)
On 2 Jul 2020, at 09:56, Tomaž Turk tomaz.turk@ef.uni-lj.si wrote:
Thanks, Tomaž, for your nice words.
Reviews on Amazon are really important.
,.;:~^~:;.,.;:~^~:;.,.;:~^~:;.,.;:~^~:;.,.;:
Alexandre Bergel http://www.bergel.eu
^~:;.,.;:~^~:;.,.;:~^~:;.,.;:~^~:;._,.;:~^~:;.
On 03-07-2020, at 16:11, Tomaž Turk tomaz.turk@ef.uni-lj.si wrote:
https://www.amazon.de/Agile-Artificial-Intelligence-Pharo-Neuroevolution/dp/1484253833/ref=sr_1_2?dchild=1&keywords=artificial+intelligence+pharo&qid=1593807040&sr=8-2
Best wishes
Tomaz
I noticed that if I trained the OR gate with 31 iterations, the first test
would fail.
If I train with 32 iterations, then the test for #(0 0) would pass.
What is the explanation for this "tipping point"?
The other tests pass, even if timeRepeat := 1.
I just decided to experiment a bit.
--
Sent from: http://forum.world.st/Pharo-Smalltalk-Users-f1310670.html
Hello,
Sorry to reply late. I just saw this email.
Each training iteration modifies the weights and the bias of the perceptron by a small amount. In this very particular case, going from 31 to 32 iterations is simply the last drop that makes everything change...
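The mechanism can be seen in a minimal, self-contained sketch. This is not the book's exact API; the class-less script below, its starting weights, and its learning rate are all made up for illustration. The key point: for the input #(0 0) both weights are multiplied by zero, so the only correction that case ever receives is a small decrease of the bias. The test for #(0 0) therefore passes only once enough epochs have pushed the bias down to zero or below; with random initial weights, as in the book, the epoch at which that happens varies, which is exactly the kind of tipping point you see between 31 and 32 iterations.

```smalltalk
"Sketch of a perceptron trained on the OR truth table.
 Initial weights, bias, and learning rate are illustrative, not the book's."
| weights bias learningRate examples |
weights := #(0.5 0.5) copy.
bias := 0.5.
learningRate := 0.1.
examples := { #(0 0) -> 0. #(0 1) -> 1. #(1 0) -> 1. #(1 1) -> 1 }.
32 timesRepeat: [
	examples do: [ :each |
		| inputs expected z output delta |
		inputs := each key.
		expected := each value.
		"Weighted sum plus bias, then a step activation"
		z := bias + ((1 to: 2) inject: 0 into: [ :sum :i |
			sum + ((weights at: i) * (inputs at: i)) ]).
		output := z > 0 ifTrue: [ 1 ] ifFalse: [ 0 ].
		"Perceptron update rule: zero inputs leave weights untouched,
		 so for #(0 0) only the bias is ever corrected"
		delta := (expected - output) * learningRate.
		1 to: 2 do: [ :i |
			weights at: i put: (weights at: i) + (delta * (inputs at: i)) ].
		bias := bias + delta ] ].
```

With these deterministic starting values the bias shrinks by 0.1 in each epoch where #(0 0) is misclassified, and that case only starts classifying correctly once the bias reaches zero; change the initial values and the required number of epochs changes with them.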
Alexandre Bergel http://www.bergel.eu
On 21-08-2020, at 01:31, bentai bentaisan@gmail.com wrote: