# Hindman’s Theorem, partial semigroups and some of my most lacking intuitions (part 7)

Hm, my writing process is slowing down a little (and on top of that I forgot to publish this draft) and there are other posts that I really want to write. I’m not really sure how I will proceed, but let’s keep pushing a little further for now. Last time I tried to build a bridge from central sets via idempotent ultrafilters back to partial semigroups. This is one of the key points of this series: connecting central sets and partial semigroups.

## Back to partial semigroups

Here we are, lots of great results behind us, yet missing the perfect correspondence between Ramsey-type theorems and partition regular sets (or, put differently, ultrafilters). I started out with the notion of (adequate) partial semigroups and now, I think, is the time to return to it.

I believe that a notion of partial semigroup could help solve this mysterious question. The goal of this entire series is to investigate a potential relationship between partial semigroups and central sets.

### Minimal partial semigroups

I tried to convince you earlier that FS-sets are partial semigroups. They are, in fact, the minimal partial subsemigroups of $(\mathbb{N},+)$. Why is this? It follows from a rather simple induction argument:

• Say $(S,\cdot)$ is an adequate partial semigroup.
• Take any $s_0\in S$.
• Inductively, $FP(s_0,\ldots, s_n)$ has all products defined.
• Pick $s_{n+1} \in \bigcap_{t \in FP(s_0,\ldots,s_n)} \sigma(t)$ (remember those $\sigma(t)$?) — since $S$ is an adequate partial semigroup, this intersection is never empty.
• Then all finite products of the $(s_i : i\in \omega)$ are defined.
• In other words, $FP(s_i) \subseteq S$ in the fullest sense.
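Here is a throwaway Python sketch of this induction in the concrete adequate partial semigroup $(\mathbb{F}, \text{disjoint union})$, where $\sigma(t) = \{ s : s\cap t = \emptyset \}$; the greedy "fresh block" choice is just one way (my own, purely illustrative) to land in the intersection of the $\sigma(t)$:

```python
from itertools import combinations

# Run the induction in (F, disjoint union): choosing each new element
# disjoint from everything picked so far puts it in the intersection of
# sigma(t) for all t in FP(s_0, ..., s_n).
def build_sequence(n):
    """Greedily pick n pairwise disjoint finite subsets of omega."""
    chosen, next_free = [], 0
    for _ in range(n):
        chosen.append(frozenset(range(next_free, next_free + 2)))
        next_free += 2          # fresh numbers guarantee disjointness
    return chosen

seq = build_sequence(4)
# all finite "products" (unions over index sets) are defined:
assert all(not (a & b) for a, b in combinations(seq, 2))
```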

I don’t think I ever mentioned that “to include an FS-set” (or FP-set) has an established abbreviation: such sets are called IP-sets.

Most people will find it important to point out that “IP” does not abbreviate “idempotent” (as in idempotent ultrafilters) but was originally meant to abbreviate “infinite parallelepiped” (which makes sense if you think of FS-sets in vector spaces, until you realize that this still means “includes an infinite parallelepiped”).

So it seems that we could add some general nonsense by saying “partial semigroup” instead of “IP-set”/“includes an FS-set”. Unfortunately, this is not the case. Even though every partial semigroup contains an FS-set, not every set that contains an FS-set has a (compatible) partial semigroup structure.

Oh dear, I just noticed something and I hope I haven’t made this mistake too often. I’m talking about FS-sets and $(\mathbb{N},+)$ here. So when I write “partial semigroup” for $A\subseteq \mathbb{N}$, I mean “partial subsemigroup” in the sense that the usual addition restricted to $A$ forms a partial subsemigroup (as is the case for FS-sets). Oh well, I guess this series is getting too long after all and I’m beginning to lose track of what I’ve already written. I hope you might just enjoy reading it anyhow.

## Weak partial semigroups

I think the key to partial semigroups will be to weaken the notion further. I think I will call those “weak partial semigroups” (even though I’d love to call them “very partial semigroups”…).

What I’m about to do is, I think, somewhat bad style. Let me alleviate this by some comments. Of course, the ideas in this series do not come out of nowhere. They stem from my studies in this area over the last 5 years (gee, has it really been that long?). Because of this, they are based on structures that you can frequently encounter in this area. So if you know the research area you will hopefully find all of this very familiar and think “why does he make such a fuss about this standard thing?”. That’s great! And if you’re not familiar with it, rest assured there’s a method to my madness. I’m not sure I will get there anytime soon, but there’s still hope.

In my silly demonstration above that partial semigroups contain FS-sets you may have noticed that we didn’t really use strong associativity — especially since we’re starting with a full semigroup anyway and have no worries about associativity.

What was crucial, however, is the finite intersection property. And that’s already all there is to this “new” notion.

Weak partial (sub)semigroup If $(S,\cdot)$ is a semigroup, then $A\subseteq S$ is a weak partial subsemigroup if the restriction of the semigroup operation to $A$ is adequate, i.e., the sets of the form
$$\sigma_A(a) := \{ b\in A : a\cdot b \in A\}$$ generate a filter.

Note: I just made up the $\sigma_A(a)$ notation to possibly help your understanding by making the connection to $\sigma(a)$. I will get to a more classical formulation in a second.

In other words, the operation restricted to $A$ (a partial operation) is, to some extent, a partial semigroup. We might not have the luxury of full associativity: $a, b, c, ab, abc$ might all be in such an $A$, but not $bc$ (this actually happens in real life, btw) — so we cannot compare $a(bc)$ with anything, in particular not with $(ab)c$.

But for any $a$ we will find many (a filter set of) $b$’s and for each of those $b$’s we will find a filter set of $c$’s such that $a,b,c,ab,bc, abc \in A$. That’s pretty good, don’t you think?
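To get a feel for the $\sigma_A(a)$ sets, here is a throwaway Python sketch on a finite chunk of $FS(10^n)$ inside $(\mathbb{N},+)$; the window size and all names are my own, purely for illustration:

```python
from itertools import combinations

def finite_sums(xs):
    """All sums over non-empty sets of distinct members of xs."""
    return {sum(c) for r in range(1, len(xs) + 1)
            for c in combinations(xs, r)}

A = finite_sums([10 ** i for i in range(6)])   # a finite chunk of FS(10^n)

def sigma_A(a):
    return {b for b in A if a + b in A}

# finitely many sigma_A(a) still intersect (the finite intersection
# property, within this finite window): a "digit-fresh" generator works.
common = sigma_A(1) & sigma_A(10) & sigma_A(11)
assert 100 in common and 1000 in common
assert 1 not in sigma_A(1)     # 1 + 1 = 2 is not in A: repetition fails
```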

I’m not sure what I’ll do next. There are many paths to choose at this point. I’m not sure which one is best and I might just settle for the one that seems most easily self-contained. But don’t worry even if the next post is on a different topic.

# Hindman’s Theorem, partial semigroups and some of my most lacking intuitions (part 6)

I know, I know, it’s part 6 already. Last time I finally formulated the Central Sets Theorem. This part will just be a small bridge. But at least you’ll finally know why on earth I am writing this, i.e., what big open question I’m actually trying to build some intuitions for.

## Central Sets and ultrafilters

Before we can move on in this series I should tell you a little bit about the relationship between ultrafilters and central sets — and finally give you something close to a definition.

If you already believed me that idempotent ultrafilters exist, you might also believe me that there’s a special kind of idempotent ultrafilter: the so-called minimal idempotents. The reason for the name “minimal” is a partial order on idempotents coming from ring theory. Yet again, we’ll skip the definition — I would first have to convince you that $\beta \mathbb{N}$ is actually a semigroup and, again, I don’t want to go there right now.

In any case, there are those idempotent ultrafilters which are minimal idempotents. As I mentioned before, Hillel Furstenberg had introduced the notion of central set via recurrence in dynamical systems. It took a couple of years until in 1990 Vitaly Bergelson and Neil Hindman helped establish that centrality can be framed extremely well in terms of ultrafilters.

Theorem (Bergelson, Hindman) $A\subseteq \mathbb{N}$ is central iff $A$ is in a minimal idempotent ultrafilter.

This was, I think, quite a crazy and beautiful result at the time and its simplicity is still stunning (although, arguably, you won’t agree since I didn’t give you the complicated definition in terms of dynamics).

This leaves us with the following situation:

• central sets and minimal idempotents correspond neatly and
• we have a Ramsey-type theorem (the Central Sets Theorem) that central sets fulfil

This is not as optimal as it could be! If you remember, we were even better off with Hindman’s Theorem:

A set is contained in an idempotent ultrafilter if and only if it “satisfies” Hindman’s Theorem (i.e., includes an FS-set).

In fact, a set is an FS-set if and only if it is contained in an idempotent ultrafilter.

Hindman’s Theorem and idempotent ultrafilters corresponded directly. The unfortunate situation for central sets is that (as far as I know) no version of the Central Sets Theorem is able to accomplish a correspondence of the form

Wanted “A set is included in a minimal idempotent ultrafilter (i.e., is a central set) if and only if it fulfills the following Central Sets Theorem”

This would be the dream, I think. And this is, in a manner of speaking, the whole point of this series. How can we get to this? As you may have guessed, I believe partial semigroups can help shed light on this. So I will return to them in the continuation.

# Hindman’s Theorem, partial semigroups and some of my most lacking intuitions (part 5)

Last time, I left you hanging — I promised the Central Sets Theorem, but only bothered you with some more stuff on partial semigroups, namely condensations. Let me make it up to you.

## The Central Sets Theorems

So what is this mysterious theorem that, even though ergodic in origin, is so close to the heart of algebra in the Stone-Cech compactification? It is not easy to formulate, but luckily I kept bugging you with $\mathbb{F}$ and condensations, so we are well prepared.

Since there are many different versions of the theorem as it has been improved and generalized over the last 30 years, the most general formulations are quite something to process. As you might remember, this series started out as an expository piece, so let’s write down a very simple version first, even weaker than the very first version proved by Furstenberg (but who knows, maybe that’s the version he first noticed). Remember that central sets are some kind of partition regular sets (that I haven’t even properly defined yet but who cares).

Central Sets Theorem (simple version) Let $A\subseteq \mathbb{N}$ be a central set. Imagine I’m giving you some arbitrary $FS(x_n)$. Then you can find an $FS(a_n) \subseteq A$ and $(s_n: n\in \omega)$ in $\mathbb{F}$ such that
$$FS(a_n + \sum_{i\in s_n} x_i) \subseteq A.$$
Just to repeat: if we set $y_n = \sum_{i\in s_n} x_i$, then
$$FS(a_n + y_n) \subseteq A.$$

This is quite odd, no? Even though the sequence $(x_n : n\in \omega)$ has nothing to do with $A$, the central set can “gobble it up”. As is to be expected, $A$ cannot gobble up all of it, but there’s a full condensation that can be translated into $A$ in a rather peculiar fashion. Somehow, $A$ is able to reproduce a shifted version of the algebraic structure of the FS-set.

Well, it gets even better. The original theorem allows us to “iterate” this result in a strong way.

Central Sets Theorem (less simple version) Let $A$ be a central set. This time, I’m giving you finitely many $FS(x_n^0),\ldots, FS(x_n^k)$. Not only can you get the above version for each sequence, but you can get them “together”:

You can find a single $FS(a_n) \subseteq A$ and a single $(s_n: n\in \omega)$ in $\mathbb{F}$ such that for all $j \le k$ simultaneously $$FS(a_n + \sum_{i\in s_n} x_i^j) \subseteq A.$$
Again, to repeat: if we set $y_n^j = \sum_{i\in s_n} x_i^j$, then for all $j\le k$
$$FS(a_n + y_n^j) \subseteq A.$$

The crucial strength is that all the $y_n^j$ are constructed using the one fixed sequence of $s_n$’s — regardless of the given sequences! That’s crazy! Even though the sequences $(x_n^j)$ are completely unrelated, we will find condensations uniformly and translate by the same sequence of $a_n$’s.

Ok, maybe this difference is a bit subtle at first, but it is quite potent. For example, we immediately get van der Waerden’s Theorem!

van der Waerden’s Theorem (sort of) If $A$ is central and some $l\in \omega$ is given, we can find an arithmetic progression of length $l$ in $A$, i.e., there exist $a\in A$ and $d\in \mathbb{N}$ such that $$a, a+d, \ldots, a+ l\cdot d \in A.$$
In fact, we can find the increment $d$ in any prescribed FS-set!

This last sentence is kind of cool: you want the increment to be a multiple of 10? 42? A gazillion? No problem!

Let’s derive this from the Central Sets Theorem. I know it’s a bit meaningless without having seen the proof of the Central Sets Theorem (which is, by the way, very elegant using ultrafilters, and I hope I’ll get around to it).

Proof. Well, the “in fact” should be taken as a hint. So I give you some length $l$ for the arithmetic progression as well as one FS-set, say $FS(x_n)$; you want to find an arithmetic progression in $A$ with some increment $d\in FS(x_n)$.

In other words, you’re trying to make $a+d, a+2d, \ldots, a+l\cdot d$ work. The idea is to look at the sequences $(j\cdot x_n: n \in \omega)$ for $1 \le j \le l$ and apply the version of the Central Sets Theorem that allows for finitely many FS-sets.

What does it give you? Well, you get a lot of $a\in A$, $s\in \mathbb{F}$ (a sequence/FS-/FP-set, whatever, but let’s just take one of each) such that
$$a + \sum_{i \in s} j\cdot x_i = a +j \sum_{i\in s} x_i \in A$$ for each $1 \le j \le l$ — oh, but together with $a\in A$ that’s an arithmetic progression with increment $d= \sum_{i\in s} x_i$, so we’re done!

Look at how much more we have, though. We can prescribe the FS-set to pick the increment from, we get an FS-set of increments (induced by the FP-set), we get an FS-set of starting points, and so forth.
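The little computation at the heart of the proof can be sanity-checked in Python; all concrete numbers below are made up for illustration:

```python
# Check the identity a + sum_{i in s} j*x_i = a + j * sum_{i in s} x_i,
# so one choice of (a, s) yields the whole arithmetic progression.
x = [3, 5, 9, 17]          # stand-in for a prescribed FS-sequence
s = [0, 2]                 # some s in F (as a list of indices)
a = 4                      # stand-in for the element the theorem provides
l = 3                      # desired progression length

d = sum(x[i] for i in s)   # the common increment, an element of FS(x_n)
progression = [a + j * d for j in range(l + 1)]
for j in range(l + 1):
    assert a + sum(j * x[i] for i in s) == progression[j]
assert progression == [4, 16, 28, 40]
```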

This was but a first taste. The Central Sets Theorem can be pushed much further. With some restrictions a central set can “gobble up” infinitely many FS-sets, it can be proved for commutative semigroups and, in a much more complicated version, for non-commutative semigroups. But we’ll stop here for now.

# Hindman’s Theorem, partial semigroups and some of my most lacking intuitions (part 4)

Well, I hope you didn’t miss me while I was on my first summer vacation in three years. So let’s continue this series. If you remember, part 3 consisted mainly of the observation that FS-sets have a partial semigroup structure induced by $\mathbb{F}$ as well as me telling you that there’s an immediate correspondence between FS-sets and idempotent ultrafilters.

I’m slowly getting to where I wanted to head all along with this series. When I write “my most lacking intuitions” in the title, I have my intuitions about central sets in mind. They are most lacking, I assure you. But with this series I wanted to clear my head a little. So let’s head down the rabbit hole, no questions asked.

## Towards the Central Sets Theorem

The Central Sets Theorem was conceived by Hillel Furstenberg. I know relatively little about its history so I am still amazed by the fact that anyone could come up with it — it’s such a strange creature. Furstenberg proved it for $\mathbb{Z}$ (but I’ll keep considering $\mathbb{N}$, the two situations are equivalent anyway). The notion of centrality has its origins in ergodic theory — unsurprisingly for Furstenberg. As fascinating and fruitful as the connection between ergodic Ramsey theory and Algebra in the Stone-Cech compactification is, I don’t plan to introduce the technology in this series, because it will take us too far from the path I have in mind (mind you, I haven’t even introduced the semigroup structure on $\beta \mathbb{N}$, so really, I shouldn’t introduce the ergodic point of view of which I know far less).

Furstenberg devised the notion of central set (for subsets of $\mathbb{N}$) which was determined by recurrence phenomena in dynamical systems. Again, I don’t want to discuss the dynamical point of view but I’ll give the ultrafilter characterization later. The key for a connection to Ramsey theory was simple.

Theorem (Furstenberg) Central sets (whatever they are) are partition regular.

Whatever central sets are (sorry for being temporarily mysterious), it is shocking how much algebraic structure central sets have to offer — which is what the Central Sets Theorem is all about. It took a while until Neil Hindman and Vitaly Bergelson made the connection between the algebraic/ultrafilter side and the ergodic side apparent. Nevertheless, the precise level of algebraic closure of central sets is still a mystery. And this mystery is the reason for this series.

## A detour: FP-sets and their condensations

One thing we should do before formulating the theorem is the following. What does an FS-set mean in the context of $\mathbb{F}$? Generally speaking, in a (partial) semigroup we have FP-sets (“finite products” instead of “finite sums”): given a sequence $(x_n: n\in \omega)$ we write

$$FP(x_n) = \{ \prod_{i \in s} x_i : s \in \mathbb{F} \}.$$

Of course, in the partial semigroup scenario, we should also think of this as a statement restricted to defined products. However, usually (e.g., in Hindman’s Theorem) all products will be defined. Also, in the non-commutative case (which I wholeheartedly ignore in this series), this notation is supposed to be read “in order”, i.e., the products are in the natural order of $s \subseteq \omega$.

For a crucial example, consider $\mathbb{F}$ itself.

If we have a sequence $(s_n : n\in \omega)$ in $\mathbb{F}$ such that the $s_n$ are pairwise disjoint, then the FP-set will be just fine — all products are defined in our partial semigroup.

This is a critical example also because we can transfer such partial subsemigroups easily: If $FP(s_n) \subseteq \mathbb{F}$ and we have some $FS(x_n) \subseteq \mathbb{N}$, then we can consider $y_n := \sum_{i\in s_n} x_i$. As long as the $s_n$ are pairwise disjoint, we get

$$FS(y_n) \subseteq FS(x_n).$$

So we have a partial subsemigroup of $FS(x_n)$ induced by a partial subsemigroup of $\mathbb{F}$!
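The transfer is easy to check by brute force on a finite window; the sequence and index sets below are arbitrary choices of mine, just for illustration:

```python
from itertools import combinations

def finite_sums(xs):
    """All sums over non-empty sets of distinct members of xs."""
    return {sum(c) for r in range(1, len(xs) + 1)
            for c in combinations(xs, r)}

x = [2 ** i for i in range(8)]
s = [frozenset({0, 1}), frozenset({2, 3}), frozenset({4, 5, 6})]
assert all(not (u & v) for u, v in combinations(s, 2))   # pairwise disjoint

y = [sum(x[i] for i in t) for t in s]      # y_n = sum_{i in s_n} x_i
assert finite_sums(y) <= finite_sums(x)    # FS(y_n) is inside FS(x_n)
```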

And this is actually a typical phenomenon thanks to Hindman’s Theorem. Remember,

Hindman’s Theorem If we partition an FS-set into finitely many pieces, one piece will contain an FS-set.

### Condensations and Intuitions

There’s one important question when it comes to developing an intuition: what should we expect when we partition again and again? One typical phenomenon is the following: take some $FS(x_n)$ and partition it into two pieces where one piece consists exactly of the elements of the generating sequence $\{ x_n : n \in \omega \}$. By Hindman’s Theorem, one part of the partition will contain an FS-set — but that’s not going to be $\{ x_n : n \in \omega \}$! Consider the case $x_n = 2^n$: then $\{ 2^n : n \in \omega \}$ certainly does not contain an FS-set; it does not even satisfy Schur’s Theorem!

So what happens in this case? Well, we can easily describe many FS-sets in the other part of the partition; e.g., take every other generator and add: $y_n= x_{2n} +x_{2n+1}$. Then $FS(y_n)$ is good for the second part of the partition.

More generally, we could take any pairwise disjoint $(s_n : n\in \omega)$ in $\mathbb{F}$, just make sure that no $s_n$ is a singleton. Then as above, $FP(s_n)$ induces an FS-subset of $FS(x_n)$ — which will lie completely in the “large” part of the partition.
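Both the concrete $y_n = x_{2n} + x_{2n+1}$ and the general claim can be verified by brute force on a finite window (a sketch; the window length is an arbitrary choice of mine):

```python
from itertools import combinations

def finite_sums(xs):
    """All sums over non-empty sets of distinct members of xs."""
    return {sum(c) for r in range(1, len(xs) + 1)
            for c in combinations(xs, r)}

x = [2 ** n for n in range(8)]
generators = set(x)                        # the "small" part of the partition
rest = finite_sums(x) - generators         # the "large" part

y = [x[2 * n] + x[2 * n + 1] for n in range(4)]   # pair up the generators
assert y == [3, 12, 48, 192]
assert finite_sums(y) <= rest              # FS(y_n) avoids the generators
```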

A word of caution: in a certain sense, partitions as the one above are unusually simple because one part does not contain an FS-set. In general, you should expect all parts to contain FS-sets (for example, when separating different idempotent ultrafilters). Nevertheless, I would say that a huge chunk of arguments regarding strongly summable ultrafilters relies on such “simple” partitions — so they are extremely useful.

The point I’m trying to make is that whenever we repeatedly partition an FS-set, Hindman’s Theorem will give us homogeneous FS-sets — but you should expect the elements in the generating sequence to be sums of many elements of the original generators!

This is why a sequence $(y_n: n \in \omega)$ in some $FS(x_n)$ with $FS(y_n)\subseteq FS(x_n)$ is called a condensation of $(x_n :n \in \omega)$ — because generally speaking many elements from $x_n$ are condensed into one of the $y_n$ (of course, some $x_n$ might just be dropped completely). The term “condensation” is also used in arbitrary (partial) semigroups.

Oh and to be absolutely clear:

Sequences in $\mathbb{F}$ are always assumed to be pairwise disjoint (so that they are pairwise compatible in our partial operation).

Alright, that’s enough for this part, I think. Next time, I’ll finally talk about the Central Sets Theorem.

# Hindman’s Theorem, partial semigroups and some of my most lacking intuitions (part 3)

Ok, time for part 3. We’re not close to an end but I must apologize that I won’t be able to post in the next week. But let’s recap. In the first part I simply explained why semigroups are not partition regular and in the second part I wrote about FS-sets and, finally, introduced partial semigroups. One of the headings promised that partial semigroups will help us talk about FS-sets.

## Are we there yet?

At this point you might also ask why I talked about partial semigroups so much when I wanted to give you an easy description of FS-sets.

Well, it’s because I did just that. FS-sets are partial semigroups — in fact, they are a lot like $\mathbb{F}$!

Why? Because for a sequence $(x_i: i < N)$ (for some $N\leq \omega$)

$$FS(x_n) := \{ \sum_{i \in s} x_i : s \in \mathbb{F} \}.$$

So we have an induced partial operation on $FS(x_n)$.

A word of warning: an FS-set can have more structure than $\mathbb{F}$ induces — just look at $FS(2^n) (= \mathbb{N})$ or in a different form, look at $FS(n)$ — same set, but much messier when it comes to the relationship between the generating sequence and the sets of finite sums!

In general, we cannot assume that each element of an FS-set has a unique description via some $s\in \mathbb{F}$ nor can we assume that the $\mathbb{F}$-induced operation captures all allowable sums. So, really, the whole affair is far from optimal!

Nevertheless, if we look at the partial operation on an FS-set induced by $\mathbb{F}$, we get a very natural and rich partial semigroup structure, even if we don’t catch all aspects.
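Here is a sketch of this induced operation in Python: represent elements of $FS(x_n)$ by their index sets $s\in\mathbb{F}$ and declare the sum defined only when the index sets are disjoint (the function names and sample sequence are hypothetical, mine alone):

```python
x = [2 ** i for i in range(6)]     # a sample generating sequence

def elem(s):
    """The element of FS(x_n) coded by the index set s."""
    return sum(x[i] for i in s)

def partial_add(s, t):
    """The F-induced partial operation: defined iff s and t are disjoint."""
    if s & t:
        return None                # undefined in the partial semigroup
    return s | t

s, t = frozenset({0, 2}), frozenset({1, 3})
u = partial_add(s, t)
assert u == frozenset({0, 1, 2, 3})
assert elem(u) == elem(s) + elem(t)               # 5 + 10 == 15
assert partial_add(s, frozenset({2, 4})) is None  # overlap: undefined
```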

From this point of view, we could re-phrase Hindman’s Theorem as follows.

Hindman’s Theorem (bad version) Partial semigroups are partition regular.

This is, of course, a most horrible example of general nonsense: hiding a concrete structure by using an abstract notion. We’re not getting an arbitrary kind of partial semigroup, we’re getting FS-sets, plain and simple.

The reason why I am writing this exposition is that, as much as I believe the preceding paragraph, I think there could be something valuable in this formulation: if we could develop the notion of partial semigroup further, we might just end up solving one of the big unknowns in this field. But before I can take you in that direction, we need to talk about ultrafilters.

## Ultrafilters on semigroups

When it comes to infinite (and sometimes even finite) Ramsey-type theorems, it is often easy (or at least nice) to give a proof via ultrafilters. The reason is very simple: if we partition a set into finitely many pieces, an ultrafilter contains exactly one of the pieces. This allows for a simplistic strategy: construct the right kind of ultrafilter and it will do all the work for you!

In the case of FS-sets and Hindman’s Theorem there is an even closer relation to ultrafilters, idempotent ultrafilters to be exact. I don’t want to prove anything here, so whatever idempotent ultrafilters are, they are a special kind of ultrafilter on semigroups (but one that exists in ZFC). They come up in the most popular proof of Hindman’s Theorem, the so-called Galvin-Glazer Theorem (so-called since neither of them was involved in getting the proof published, and the way I’ll write it, it’s really Galvin’s).

Galvin-Glazer Theorem If $p \in \beta \mathbb{N}$ is an idempotent ultrafilter, $A\in p$, then $A$ contains an FS-set.

Hindman’s Theorem is then an immediate corollary.

I love the odd history of Hindman’s Theorem, so allow me to digress. Around 1970 Galvin actually had the proof but didn’t know if idempotent ultrafilters existed (and he had a different name because he was thinking about them the “wrong” way). Then in 1972 Hindman used CH and, assuming “his” theorem, built an ultrafilter as Galvin needed. Then in 1974 Hindman proved “his” theorem combinatorially. Yet in 1975 Galvin met Glazer who told him that idempotents exist (which, looked at the “right” way, had been known since 1958!).

Coming back to the relation between FS-sets and ultrafilters, there’s also a reverse:

If $A$ contains an FS-set, then $A$ is in an idempotent ultrafilter.

In other words, $A$ contains an FS-set iff $A$ is in an idempotent ultrafilter.

Again, the proof is not important right now (don’t worry I’ll probably get back to it later). What is important is that, in this sense, the structure of FS-sets and idempotent ultrafilters is immediately connected.

And the point, in fact the point of this whole series, is that this relationship is missing for stronger algebraic Ramsey Theorems.

If you read this far, my thanks. I won’t be able to post anything next week, but there’s more to follow (and written but not yet prepped for posting). So stay tuned!

# Hindman’s Theorem, partial semigroups and some of my most lacking intuitions (part 2)

Yesterday I finally started this short series with some small thoughts regarding partial semigroups. If you don’t remember, all I did yesterday was to explain why semigroups are not, in general, partition regular using a simple partition of $\mathbb{N}$. I was trying to keep your hopes up by pointing out Schur’s Theorem.

## So how far can we push?

There are essentially two paths to choose from here. The next step in a decent text book would probably tell you about van der Waerden’s Theorem (1927), Rado’s Theorem (1970) or the Folkman-Rado-Sanders Theorem (Sanders being the first to publish it, in his PhD thesis (1968, see Jon Sanders’s comment below), but usually the last to be mentioned because Folkman had some “private communication” with Graham and Rothschild).

The first two theorems lead down a different path, towards partition regular matrices which, from the point of view of Schur’s Theorem, could be described as “getting more linear combinations”. The last one on the other hand could be described as “getting some algebraic closure” and this is where we came from in the last post. But I would like to skip ahead or else we’ll still be sitting here next week.

Let me get back to talking to you about Hindman’s Theorem. Skipping its curious history, let’s see where Schur left us. We have sums of two elements — very often. But what about three? Or any finite number? Well, we know that repetition is a problem — after all, that’s what prevented subsemigroups from being a partition regular notion. So we have to drop repetitions. What does that leave us with? Distinct sums.

FS-set Given a sequence $(x_n: n<N)$ (with $N \leq \omega$, but we’re really interested in $N=\omega$) let’s denote the set of distinct finite sums of elements from this sequence by $FS(x_n)$, i.e., $$FS(x_n) = \{ x_{i_0} + \ldots + x_{i_k} : i_0, \ldots, i_k<N \mbox{ pairwise different} \}.$$

So we collect all results of summing elements from our sequence without doing any repetitions. Fair enough. This seems to be as far as we can get without getting at an actual subsemigroup.

If we’re really trying to look at this structure, we should do a reality check. How rich are such sets? Well, first of all, $FS(2^n) = \mathbb{N}$, so, yeah, they can be pretty rich. On the other hand, you have the potential of huge gaps — $FS(10^n)$ has $1, 10, 11, 100, 101, 110, 111, 1000$ and so forth — immense gaps will appear, but with each new element from the sequence, we’ll see a large cluster of numbers at relatively close distance. That’s not bad given our counterexample for subsemigroups.
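A quick throwaway computation confirms this picture:

```python
from itertools import combinations

def finite_sums(xs):
    """All sums over non-empty sets of distinct members of xs."""
    return {sum(c) for r in range(1, len(xs) + 1)
            for c in combinations(xs, r)}

first = sorted(finite_sums([10 ** n for n in range(4)]))
assert first[:8] == [1, 10, 11, 100, 101, 110, 111, 1000]
# clusters near each generator, with ever larger gaps in between:
assert 1000 - 111 == 889
```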

The surprising thing is that this property — to contain an FS-set — is actually partition regular. And this is exactly what Hindman showed in 1974.

Hindman’s Theorem FS-sets are partition regular.

What a great theorem! All we had to do was drop repetitions and we immediately get partition regularity. That’s a good deal. And yet, it raises the question: can we formalize the kind of algebraic closure appearing in an FS-set?

## Is there an easy way to describe FS-sets?

The thing with FS-sets is that they look awfully like subsemigroups. You can almost always add two elements together, and not just those from the underlying sequence. Given an element of an FS-set, all but finitely many elements of the underlying sequence can be added to it.

But also when we have two different sums chances aren’t too bad that we can add them — the only restriction is that the two sums must have been built from pairwise different elements from the underlying sequence.

Thankfully, we can describe this messy business efficiently by introducing one of my favorite and one of the most important structures: $\mathbb{F}$.

What is $\mathbb{F}$? Following other researchers (especially Andreas Blass) I will denote the finite non-empty subsets of $\omega$ by $\mathbb{F}$.

$$\mathbb{F} := [\omega]^{<\omega} \setminus \{ \emptyset \}.$$

Why on earth would I introduce a new notation for such a standard set-theoretic object, you ask? Well, mostly because I didn’t — others did. But it makes sense because our interest is in a certain algebraic structure on $\mathbb{F}$. (Personally, I also like the blackboard boldface F — it’s pretty.)

So what structure? There already exists a very well known and important algebraic structure on $[\omega]^{<\omega}$ — the group operation of symmetric difference $\Delta$. This operation gives us the countable Boolean group: every element is its own inverse, i.e., we’re talking about $\bigoplus_{i \in\omega} \mathbb{Z}_2$ here.

On the other hand, $[\omega]^{<\omega}$ has a very (I mean very) natural semigroup operation: taking the union, $\cup$. This operation is, of course, as far away from being a group operation as seems possible; every element is idempotent!

It’s not historically relevant (as far as I know), but the part that is interesting here is the “intersection” of these operations, i.e., when these two different operations agree.

Yes, I am making all this fuss just to talk about disjoint unions. You’ll say, that’s ridiculous! Disjoint unions are easy! Well, you’re right. And that’s exactly the point, actually. Addition is such a pain! It is subtle and messy (all that carrying over). Disjoint unions are soooo much easier, right?

(Right. Just you wait.)

So what about disjoint unions? Isn’t that a pretty tough restriction? If I was trying to do the usual mathematicians-love-obfuscation thing, I could probably make a big deal out of the fact that restricting our operations to disjoint unions eliminates all multiples (which is, like, super important, because I was, like, talking about that all the time, right??? — that is, of course, getting it backwards, so don’t think that way, please!).

The thing that is interesting is that disjoint unions still satisfy a kind of associativity law (warning: I’m going to use $\cdot$ now — don’t freak out — but we’re not in Kansas, i.e., $(\mathbb{N},+)$, anymore)

$$(a \cdot b) \cdot c = a \cdot (b \cdot c)$$
in the sense that, if one side is defined, so is the other and they are equal.

Of course, “they are equal” is not the point here — after all, we have restricted a (semi)group operation — so for us the key should be: if one side is defined then so is the other. In other words, our operation is relatively rich.
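Strong associativity for disjoint unions can be checked exhaustively on a tiny universe; in this throwaway sketch of mine, `None` stands for “undefined”:

```python
from itertools import product

def dot(a, b):
    """Disjoint union: defined (the union) iff a and b are disjoint."""
    return a | b if not (a & b) else None

def left(a, b, c):                 # (a . b) . c, propagating undefinedness
    ab = dot(a, b)
    return dot(ab, c) if ab is not None else None

def right(a, b, c):                # a . (b . c)
    bc = dot(b, c)
    return dot(a, bc) if bc is not None else None

universe = [frozenset(s) for s in ({0}, {1}, {2}, {0, 1}, {1, 2})]
for a, b, c in product(universe, repeat=3):
    assert left(a, b, c) == right(a, b, c)   # defined together, and equal
```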

This is what Bergelson, Blass and Hindman dubbed “strong associativity” in their seminal paper. A set with a partial operation satisfying strong associativity they called a partial semigroup.

We have a slight problem though: the empty operation would vacuously be strongly associative, so “rich” was really an exaggeration. That’s why you’d always want to assume that there’s really something going on. A pretty natural richness condition (which disjoint unions have) is the following.

Given finitely many elements, there’s another one that can be multiplied to any of them.

We might not be able to multiply two given elements, but at least each element has a good supply of elements it can be multiplied with.

What we’re looking at is simply a finite-intersection-property. Namely, the family of sets “can be multiplied with $x$” has the finite intersection property. (What we really want is the infinite finite intersection property, but that’s not too important right now.) For historical reference, if we have this, it’s called an adequate partial semigroup — and we’re not interested in anything else.

Technically, we also need to say from which side we multiply. I usually prefer “from the right” and write the set as $\sigma(x) := \{ y : x\cdot y \mbox{ defined } \}$.
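For the disjoint-union example, the sets $\sigma(x)$ and their finite intersection property are easy to see concretely. A tiny sketch (my own encoding again; `sigma` and `fresh` are hypothetical helper names):

```python
def sigma(x):
    """σ(x) for the disjoint-union partial semigroup, as a predicate:
    y ∈ σ(x) iff x · y is defined, i.e. x ∩ y = ∅."""
    return lambda y: x.isdisjoint(y)

# Given finitely many elements x_1, ..., x_k, a singleton on a fresh
# number lies in every σ(x_i) -- so the family {σ(x)} has the finite
# intersection property (adequacy).
xs = [frozenset({1, 2}), frozenset({2, 3}), frozenset({5})]
fresh = frozenset({max(max(x) for x in xs) + 1})  # here: {6}
assert all(sigma(x)(fresh) for x in xs)
```

Since $\mathbb{N}$ is infinite, a fresh element is always available, which is exactly why disjoint unions form an adequate partial semigroup.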

## Are you messing with me (again)?

After I have forced this new notion upon you, here’s the thing: every partial semigroup can be extended to a semigroup. All you have to do is add one element, say $U$, which takes all the “undefined” values and otherwise acts as a multiplicative zero, i.e., $a \cdot U = U \cdot a = U$. It’s easy to check that, thanks to strong associativity, this extended operation really is associative.

What’s the point then? Partial semigroups have a huge advantage: they allow us to focus on the part of our algebraic semigroup structure that is important for us. Just like disjoint unions are the important structure for $\mathbb{F}$ because they eliminate, e.g., the irrelevant idempotency of individual elements under $\cup$. In addition, we will see that partial subsemigroups are much more abundant and, in my eyes, one key for understanding Hindman’s Theorem and its generalizations.

In the next part I will try to convince you that partial semigroups are indeed a good answer to the question “What kind of structure is flexible enough to survive partitioning?”

# Hindman’s Theorem, partial semigroups and some of my most lacking intuitions (part 1)

If you remember, I mentioned that I was working on a post on some research and it was getting out of hand. Well, it is still not finished, but it is long enough to start a series of posts.

It’s no secret that I love the mathematical world surrounding Hindman’s Theorem. Recently, I have been revisiting an old draft of mine. To get back into it, I want to jot down some informal notes about one of the research directions I think are worth pursuing, even though I have no proof of this (pardon the pun). At the heart of this line of thought lies the notion of partial semigroups and, especially, my ideas for weaker forms of that notion. To fully make sense of it I have to take you from elementary algebraic structures to lesser-known structures, to ultrafilters on those, and finally to filters related to all of this.

But let me start at the beginning so that you can still enjoy reading a little bit of what I want to express without being lost after 5 minutes.

Hindman’s Theorem is a typical Ramsey-type theorem, i.e., it tells us about a certain richness in some fixed mathematical structure (such as a graph in the original theorem by Ramsey), a richness which is partition regular:

Partition regular notions of richness: If the structure is partitioned into finitely many pieces (equivalently, colored with finitely many colors), one piece (color) will have this kind of richness.

More concretely, Hindman’s Theorem is about algebraic structures, namely, semigroups. Semigroups, if you don’t remember, are simply sets with an associative operation

$$(a \cdot b) \cdot c = a \cdot (b \cdot c).$$

## semigroups? that’s easy! subsemigroups!

If you are looking for partition regular notions and you meet a semigroup, you can, of course, ask yourself:

Are semigroups partition regular?

(Un)fortunately, that’s not going to work.

To give you a counterexample, let’s look at the natural numbers, $\mathbb{N}$, with the usual addition. Unlike other research into semigroups, almost all of the work in this Ramsey-theoretic area is connected to $\mathbb{N}$.

Ah, I just noticed: I have made a mistake. If you’re a set theorist and $\mathbb{N} = \omega$, then actually, this is going to work, because $0$ generates a subsemigroup ($\{0\}$), so whenever we partition $\omega$, one part will include a subsemigroup. (More generally, monoids will do this, and even more generally, so will semigroups with idempotent elements.)

Oh well. But that’s “clearly” not the point (especially if you’re a set theorist). You’ll most likely be interested in large, hopefully infinite structures — and in any case, you’re not interested in trivial solutions by idempotent elements…

To get back to our example, we’ll ignore $0$ by adopting the convention most people in this area use:

$$\mathbb{N} = \omega \setminus \{0\}$$

## subsemigroup #Fail.

Subsemigroups have a nice property: they contain sets of ‘multiples’. Given any element in the subsemigroup, we find all its ‘multiples’ in it, since we can add the element to itself repeatedly. Of course, for $(\mathbb{N},+)$ this really means we’re talking about multiples in the natural sense.

If we have (thanks to the heading of this section) the suspicion that semigroups are not partition regular, we need to find a partition such that no part contains a set of multiples.

When I was lying in bed a couple of days ago, falling asleep but trying to put the ideas together for this post, it took me a while to come up with a partition. Since at the end I was actually asleep, I don’t really remember how I got there. So I’m afraid I can’t lead you to it, cannot communicate what intuitions about partitions of natural numbers came up in the process. If I were to venture a guess, I probably just remembered the solution for a more complicated (i.e., weaker) structure. Not a grand revelation, I admit.

So how do we do this? Well, the thing about multiples is that they are evenly spaced. So if a part of our partition has a large interval missing, say of length larger than some given number $a$, then the multiples of $a$ will ‘hit’ that interval, i.e., there’ll be a multiple of $a$ in that gap, hence not in our part of the partition.

So if we partition $\mathbb{N}$ into quickly increasing intervals, say for simplicity

$$[1,10), [10,100), [100,1000) \ldots$$

we can do the following: Collect every other interval, i.e.,

$$A_0 := \bigcup_{i\in \omega} [10^{2i},10^{2i+1})$$

and make $A_1$ its complement $A_1 = \bigcup_{i\in \omega} [10^{2i+1},10^{2i+2})$.

Then both $A_0$ and $A_1$ have increasingly large gaps, arbitrarily large gaps.

Now take a number in $A_0$; it lies in one of the intervals. But its multiples cannot skip the next interval, which is far too long for that. I mean, if you start in $[100,1000)$, then your best chance really is $999$, and of course $1998$ is far from being in the interval $[10000,100000)$… The same argument goes through for any other number (and also for $A_1$): just look at the largest multiple still in the interval and then double it.

In other words, for any given number, not all multiples lie in the same part of the partition, i.e., neither $A_0$ nor $A_1$ contain a set of multiples — in particular, neither contains a subsemigroup of $\mathbb{N}$ — #Fail.
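The whole counterexample fits in a few lines. A sketch of the base-10 interval partition above (the helper names and the search bound are my own choices, not from any standard source):

```python
def part(n):
    """0 if n lies in A_0 = ∪ [10^{2i}, 10^{2i+1}), else 1 (i.e. n ∈ A_1)."""
    k = len(str(n)) - 1  # n lies in the interval [10^k, 10^{k+1})
    return k % 2

def colors_of_multiples(a, limit=10**6):
    """The set of parts hit by the multiples of a below the cutoff."""
    return {part(m) for m in range(a, limit, a)}

# Every a here has multiples in both parts, so neither A_0 nor A_1
# contains a full set of multiples.
assert all(colors_of_multiples(a) == {0, 1} for a in range(1, 200))
```

The evenly-spaced multiples of $a$ cannot jump over the next interval, which is what the assertion confirms numerically for small $a$.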

## I’m not just messing with you

Now you can ask yourself if this hope for partition regular structures within semigroups was nothing but a dream and there’s simply no algebraic structure that survives partitioning.

Luckily there’s evidence dating back all the way to 1917 guaranteeing that at least a tiny little bit of algebraic structure will survive partitioning. Issai Schur proved that if you partition $\mathbb{N}$ into finitely many pieces, you will always find two numbers $x,y$ such that $x, y$ and $x+y$ are in the same piece. (Of course, you’ll find infinitely many such pairs.)

It’s not important, but the proof is quite simple actually, using Ramsey’s Theorem (the original thing). Given a coloring of $\mathbb{N}$, use that coloring to define a coloring of all unordered pairs of natural numbers: give $\{x,y\}$ (with $x < y$) the color of $y-x$. By Ramsey’s Theorem, there exists a homogeneous infinite set. Pick any three numbers $x < y < z$ in that homogeneous set. Then $y - x$, $z - y$ and $z - x$ all have the same color, which solves our problem, since $(z-y) + (y-x) = z-x$; notice how associativity comes into play.
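Schur’s theorem is also fun to play with computationally. Here is a brute-force sketch (my own hypothetical helper, not part of the proof above) that hunts for a monochromatic triple $x$, $y$, $x+y$:

```python
def schur_triple(coloring, bound):
    """Search for x, y with x, y, x+y all the same color below the bound.
    Schur's theorem guarantees such a pair exists for any finite coloring,
    provided the bound is large enough."""
    for x in range(1, bound):
        for y in range(x, bound - x):
            if coloring(x) == coloring(y) == coloring(x + y):
                return x, y
    return None

# Example: the 2-coloring by parity has the monochromatic triple 2, 2, 4.
assert schur_triple(lambda n: n % 2, 100) is not None
```

For the interval partition from earlier in the post, the same search would also succeed, just as the theorem promises.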

### outlook

In the next post, I’ll try to point out how far we can push this search for algebraic structures.