Some progress with InfoGAN

I recently shared my first (and, unfortunately, rather disappointing) results of applying InfoGAN [1] to simple shapes. Over the past few weeks, I have continued to work on this, and my results are starting to look more promising. Today, I’m going to share the current state of my research.


Extending Logic Tensor Networks (Part 1)

I have already talked about Logic Tensor Networks (LTNs for short) in the past (see here and here) and have announced my plans to work with them. Today, I will share with you my first steps toward modifying and extending the framework. More specifically, I will talk about a problem with the original membership function and how I solved it.


Logic Tensor Networks and Conceptual Spaces

In my last blog post, I introduced the general idea of Logic Tensor Networks (or LTNs, for short). Today I would like to talk about how LTNs and conceptual spaces can potentially fit together and about the concrete strands of research I plan to pursue.


What are “Logic Tensor Networks”?

About half a year ago, I mentioned “Logic Tensor Networks” in my short summary of the Dagstuhl seminar on neural-symbolic computation. I think this is a highly interesting approach, and since I intend to work with it in the future, I will briefly introduce the framework today.
