Category Archives: Introduction
COVID-19 Modelling and Random Social Networks
Seems like everyone wants to be an epidemiologist these days, so why not OpenCog? After all, diseases spread through networks, propagating from one node to the next. A network is a graph, the AtomSpace is a graph database, and the … Continue reading
Posted in Design, Development, Documentation, Introduction, Theory
Leave a comment
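To make the excerpt above concrete, here is a minimal Python sketch of epidemic spread over a random social network. It is not the AtomSpace-based model the post describes; the graph construction, the transmission and recovery probabilities, and the SIR-style states are all illustrative assumptions.

```python
# Toy sketch: disease spread as propagation from node to node over a random
# social network. Parameters and the SIR update rule are illustrative only.
import random

def random_network(n, p):
    """Erdos-Renyi style graph: each pair of people knows each other with probability p."""
    edges = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if random.random() < p:
                edges[i].add(j)
                edges[j].add(i)
    return edges

def simulate_sir(edges, p_transmit=0.05, p_recover=0.1, steps=100, seed_node=0):
    """Propagate infection along network edges; S = susceptible, I = infected, R = recovered."""
    state = {node: "S" for node in edges}
    state[seed_node] = "I"                      # one initial infection
    for _ in range(steps):
        newly_infected, newly_recovered = [], []
        for node, s in state.items():
            if s != "I":
                continue
            for neighbor in edges[node]:
                if state[neighbor] == "S" and random.random() < p_transmit:
                    newly_infected.append(neighbor)
            if random.random() < p_recover:
                newly_recovered.append(node)
        for node in newly_infected:
            state[node] = "I"
        for node in newly_recovered:
            state[node] = "R"
    return state

network = random_network(200, 0.03)
final = simulate_sir(network)
print(sum(1 for s in final.values() if s != "S"), "of", len(final), "nodes were ever infected")
```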
Value Flows
Graphs and graph databases are now accepted as a very good (the best?) way of capturing the relationship between things. Yet, many of the “things” represented graphically are actually processes. An example might be the water cycle in nature: rain … Continue reading
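As a toy illustration of the point in the excerpt above (the graph's structure is static, but what matters is what flows through it), the Python snippet below pushes quantities around a tiny water-cycle graph. This is not the AtomSpace Value API; the node names, edges, and transfer rates are made up for the example.

```python
# Toy "value flow": the graph stays fixed while quantities move along its edges.
stocks = {"ocean": 1000.0, "atmosphere": 10.0, "ground": 50.0}

# Directed edges with per-step transfer fractions: (source, target, fraction).
flows = [
    ("ocean", "atmosphere", 0.01),   # evaporation
    ("atmosphere", "ground", 0.50),  # rain
    ("ground", "ocean", 0.20),       # runoff
]

for step in range(10):
    # Compute all transfers from the current state, then apply them.
    transfers = [(src, dst, stocks[src] * frac) for src, dst, frac in flows]
    for src, dst, amount in transfers:
        stocks[src] -= amount
        stocks[dst] += amount
    print(step, {name: round(v, 2) for name, v in stocks.items()})
```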
Why Hypergraphs?
I’ve recently been hacking on creating a new parser for the Link Grammar theory of natural language parsing. I want to couple parsing to machine learning (ML), so that I can use ML to learn natural languages. To do that, I need to place everything in a certain abstract data representation framework that allows graph rewrite rules, logical reasoning, and Bayesian probabilistic reasoning to be combined. This framework exists in OpenCog, but few people know or understand this. That this framework also has a firm foundation in model theory, category theory (even n-categories!) and type theory is even less well known. To explain all this, I just wrote a simple, easy introduction to all of these ideas, and how they come together. Follow the link for more. Continue reading
Posted in Design, Introduction, Theory
Tagged dependency grammar, HyperGraphDB, Learning, linguistics, link-grammar, MachineLearning, Natural Language Processing, OpenCog, PLN, RelEx
40 Comments
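For readers who want a feel for what a hypergraph buys you before following the link above: a link can connect any number of items, and links can nest inside other links, unlike an ordinary graph edge, which joins exactly two vertices. The Python sketch below illustrates only that idea; it is not the OpenCog AtomSpace API, and the class names and example expression are invented for illustration.

```python
# Minimal hypergraph sketch: a Link is a hyperedge over any number of members,
# and those members may themselves be Links. Not the AtomSpace API.
class Node:
    def __init__(self, name):
        self.name = name
    def __repr__(self):
        return self.name

class Link:
    """A hyperedge: an ordered connection among any number of nodes or links."""
    def __init__(self, label, *outgoing):
        self.label = label
        self.outgoing = outgoing
    def __repr__(self):
        return f"({self.label} {' '.join(map(repr, self.outgoing))})"

# A parse-style relation connecting a predicate and two words, itself wrapped
# inside another link -- something a plain two-ended graph edge cannot express.
throws = Link("Evaluation", Node("throws"), Link("List", Node("Alice"), Node("ball")))
print(throws)   # (Evaluation throws (List Alice ball))
```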
The AGI Summer School 2009
In the middle of last year, Xiamen University hosted the first international summer school on Artificial General Intelligence. While several of the core OpenCog developers, along with Ben Goertzel, were there to teach, it passed by somewhat quietly on our blog … Continue reading