Sunday 25 March 2012

CHAs II: The Hopf Algebra SSym of Permutations


Last time we looked at YSym, the (dual of the) Loday-Ronco Hopf algebra of binary trees. This time I want to look at SSym, the Malvenuto-Reutenauer Hopf algebra of permutations. (In due course we'll see that YSym and SSym are quite closely related.)

The fundamental basis for SSym is (indexed by) the permutations of [1..n] for n <- [0..]. As usual, we work in the free vector space over this basis. Hence a typical element of SSym might be something like [1,3,2] + 2 [4,1,3,2]. Here's some code:

newtype SSymF = SSymF [Int] deriving (Eq)

instance Ord SSymF where
    compare (SSymF xs) (SSymF ys) = compare (length xs, xs) (length ys, ys)

instance Show SSymF where
    show (SSymF xs) = "F " ++ show xs

ssymF :: [Int] -> Vect Q SSymF
ssymF xs | L.sort xs == [1..n] = return (SSymF xs)
         | otherwise = error "Not a permutation of [1..n]"
         where n = length xs


(The "F" in SSymF stands for fundamental basis: we may have cause to look at other bases in due course.)

Let's try it out:

$ cabal update
$ cabal install HaskellForMaths
$ ghci
> :m Math.Algebras.Structures Math.Combinatorics.CombinatorialHopfAlgebra
> ssymF [1,3,2] + 2 * ssymF [4,1,3,2]
F [1,3,2]+2F [4,1,3,2]

Ok, so how can we define an algebra structure on this basis? How can we multiply permutations? (We want to consider permutations as combinatorial rather than algebraic objects here. So no, the answer isn't the group algebra.) One possibility would be to use the following shifted concatenation operation:

shiftedConcat (SSymF xs) (SSymF ys) = let k = length xs in SSymF (xs ++ map (+k) ys)

For example:

> shiftedConcat (SSymF [1,2]) (SSymF [2,1,3])
F [1,2,4,3,5]

This has the required properties. It's associative:

> quickCheck (\x y z -> shiftedConcat (shiftedConcat x y) z == shiftedConcat x (shiftedConcat y z))
+++ OK, passed 100 tests.

And it's pretty obvious that the empty permutation, SSymF [], is a left and right identity. Hence we could form a monoid algebra using this operation.
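
Just as a sketch, this is roughly what that monoid algebra would look like (it's not the Algebra instance the library actually defines for SSymF; the one it does define comes below):

instance (Eq k, Num k) => Algebra k SSymF where
    unit x = x *> return (SSymF [])
    mult = linear mult'
        where mult' (x,y) = return (shiftedConcat x y)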

However, for the Hopf algebra we're going to look at, we will use a slightly more complicated multiplication. We will retain the idea of shifting the second permutation, so that the two lists are disjoint. However, instead of just concatenating them, we will form the sum of all possible "shuffles" of the two lists. Here's the shuffle code:

shuffles (x:xs) (y:ys) = map (x:) (shuffles xs (y:ys)) ++ map (y:) (shuffles (x:xs) ys)
shuffles xs [] = [xs]
shuffles [] ys = [ys]

So shuffles takes two input "decks of cards", and it outputs all possible ways that they can be shuffled together, while preserving the order between cards from the same deck. For example:

> shuffles [1,2] [3,4,5]
[[1,2,3,4,5],[1,3,2,4,5],[1,3,4,2,5],[1,3,4,5,2],[3,1,2,4,5],[3,1,4,2,5],[3,1,4,5,2],[3,4,1,2,5],[3,4,1,5,2],[3,4,5,1,2]]

Notice how in each of the output shuffles, we have 1 before 2, and 3 before 4 before 5. This enables us to define an algebra structure on permutations as follows:

instance (Eq k, Num k) => Algebra k SSymF where
    unit x = x *> return (SSymF [])
    mult = linear mult'
        where mult' (SSymF xs, SSymF ys) =
                  let k = length xs
                  in sumv [return (SSymF zs) | zs <- shuffles xs (map (+k) ys)]

For example:

> ssymF [1,2] * ssymF [2,1,3]
F [1,2,4,3,5]+F [1,4,2,3,5]+F [1,4,3,2,5]+F [1,4,3,5,2]+F [4,1,2,3,5]+F [4,1,3,2,5]+F [4,1,3,5,2]+F [4,3,1,2,5]+F [4,3,1,5,2]+F [4,3,5,1,2]

It's clear that ssymF [] is indeed a left and right unit for this multiplication. It's also fairly clear that this multiplication is associative (because both shifting and shuffling are). Let's just check:

> quickCheck (prop_Algebra :: (Q, Vect Q SSymF, Vect Q SSymF, Vect Q SSymF) -> Bool)
+++ OK, passed 100 tests.

(The test code isn't exposed in the package, so you'll have to dig around in the source if you want to try this. It takes a minute or two, because every time we multiply we end up with lots of terms.)


What about a coalgebra structure? As I mentioned last time, in a combinatorial Hopf algebra, the comultiplication is usually a sum of the different ways to take apart our combinatorial object into two parts. In this case, we take a permutation apart by "deconcatenating" it (considered as a list) into two pieces. This is like cutting a deck of cards:

deconcatenations xs = zip (inits xs) (tails xs)

For example:

> deconcatenations [2,3,4,1]
[([],[2,3,4,1]),([2],[3,4,1]),([2,3],[4,1]),([2,3,4],[1]),([2,3,4,1],[])]

However, most of those parts are no longer permutations of [1..n] (for any n), because they are missing some numbers. In order to get back to permutations, we need to "flatten" each part:

flatten xs = let mapping = zip (L.sort xs) [1..]
             in [y | x <- xs, let Just y = lookup x mapping]

For example:

> flatten [3,4,1]
[2,3,1]

Putting the deconcatenation and flattening together we get the following coalgebra definition:

instance (Eq k, Num k) => Coalgebra k SSymF where
    counit = unwrap . linear counit' where counit' (SSymF xs) = if null xs then 1 else 0
    comult = linear comult'
        where comult' (SSymF xs) = sumv [return (SSymF (flatten us), SSymF (flatten vs))
                                        | (us, vs) <- deconcatenations xs]

For example:

> comult $ ssymF [2,3,4,1]
(F [],F [2,3,4,1])+(F [1],F [2,3,1])+(F [1,2],F [2,1])+(F [1,2,3],F [1])+(F [2,3,4,1],F [])

(Recall that the result should be read as F[]⊗F[2,3,4,1] + F[1]⊗F[2,3,1] + ...)

It's fairly clear that this comultiplication is coassociative. The counit properties are equally straightforward. Hence:

> quickCheck (prop_Coalgebra :: Vect Q SSymF -> Bool)
+++ OK, passed 100 tests.

Is it a bialgebra? Do the algebra and coalgebra structures commute with one another? In previous posts I've been a bit hand-wavy about this, so let's take a short time out to look at what this actually means. For example, what does it mean for mult and comult to commute? Well, it means that the following diagram commutes:
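
Here B stands for our bialgebra (SSym, in this case). The square has B⊗B in the top left and B⊗B in the bottom right. Along the top and down the right is comult . mult (multiply, then comultiply). Down the left and along the bottom is "comultiply each factor, then multiply corresponding pieces together", which written out with the middle swap made explicit is (mult ⊗ mult) . (id ⊗ swap ⊗ id) . (comult ⊗ comult). Saying the diagram commutes is saying that these two composites are equal as linear maps from B⊗B to B⊗B.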

But hold on a sec - what is the mult operation on (B⊗B)⊗(B⊗B) - is B⊗B even an algebra? And similarly, what is the comult operation on B⊗B - is it even a coalgebra?

Yes they are. Given any algebras A and B, we can define an algebra structure on A⊗B via (a1⊗b1) * (a2⊗b2) = (a1*a2)⊗(b1*b2). In code:

instance (Eq k, Num k, Ord a, Ord b, Algebra k a, Algebra k b) => Algebra k (Tensor a b) where
    unit x = x *> (unit 1 `te` unit 1)
    mult = (mult `tf` mult) . fmap (\((a,b),(a',b')) -> ((a,a'),(b,b')) )

Similarly, given coalgebras A and B we can define a coalgebra structure on A⊗B as follows:

instance (Eq k, Num k, Ord a, Ord b, Coalgebra k a, Coalgebra k b) => Coalgebra k (Tensor a b) where
    counit = unwrap . linear counit'
        where counit' (a,b) = (wrap . counit . return) a * (wrap . counit . return) b
    comult = nf . fmap (\((a,a'),(b,b')) -> ((a,b),(a',b')) ) . (comult `tf` comult)

(Recall that those pesky wrap and unwrap calls are the isomorphisms k <-> Vect k (). What the counit definition really says is that counit (a⊗b) = counit a * counit b.)

Notice how in both the mult and comult definitions, we have to swap the middle two terms of the fourfold tensor product over, in order to have something of the right type.


So how does this work for SSym? Well, we start in the top left with SSym⊗SSym. You can think of that as two decks of cards.

If we go along the top and down the right, then we first shuffle the two decks together (in all possible ways), and then deconcatenate the results (in all possible ways).

If we go down the left and along the bottom, then we first deconcatenate each deck independently (in all possible ways), and then shuffle the first parts of both decks together and separately shuffle the second parts together (in all possible ways).

You can kind of see that this is going to lead to the same result. It's just saying that it doesn't matter whether you shuffle before you cut or cut before you shuffle. (There's also shifting and flattening going on, of course, but it's clear that doesn't affect the result.)
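
For example, take two one-card decks, F [1] and F [1]. Going along the top and down the right: F [1] * F [1] = F [1,2] + F [2,1], and deconcatenating (and flattening) those two terms gives

F []⊗F [1,2] + F [1]⊗F [1] + F [1,2]⊗F [] + F []⊗F [2,1] + F [1]⊗F [1] + F [2,1]⊗F []

Going down the left and along the bottom: each deck deconcatenates as F []⊗F [1] + F [1]⊗F []; shuffling the first parts together and (separately) the second parts together, over all four combinations, gives exactly the same six terms, with F [1]⊗F [1] again appearing twice because either deck can contribute its card to the first half.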

For a bialgebra, we require that each of unit and mult commutes with each of counit and comult - so four conditions in all. The other three are much easier to verify, so I'll leave them as an exercise.
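
(Spelled out, the other three conditions are: counit (x*y) == counit x * counit y, i.e. counit is multiplicative; comult (ssymF []) == ssymF [] ⊗ ssymF []; and counit (ssymF []) == 1.)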

Just to check:

> quickCheck (prop_Bialgebra :: (Q, Vect Q SSymF, Vect Q SSymF) -> Bool)
+++ OK, passed 100 tests.

Okay so what about a Hopf algebra structure? Is there an antipode operation?

Well, recall that when we looked last time at YSym, we saw that it was possible to give a recursive definition of the antipode map. This isn't always possible. The reason it was possible for YSym was:
- We saw that the comultiplication of a tree t is a sum of terms u⊗v where, apart from the term 1⊗t, every term has a strictly smaller tree on the right hand side.
- We saw that the counit is 1 on the smallest tree, and 0 otherwise.

It turns out that these are instances of a more general concept: that of a graded, connected coalgebra.
- A graded vector space means that there is a concept of the size or degree of the basis elements. For YSym, the degree was the number of nodes in the tree. A graded coalgebra means that the comultiplication respects the degree, in the sense that if comult t is a sum of terms u⊗v, then degree u + degree v = degree t.
- A connected coalgebra means that the counit is 1 on the degree zero piece, and 0 otherwise.

(There is a basis-independent way to explain this, for the purists.)
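
In a graded connected bialgebra the antipode comes for free, by recursion on degree: for x of positive degree, comult x = 1⊗x + (a sum of terms u⊗v in which v has strictly smaller degree than x), and the Hopf algebra condition then forces antipode 1 = 1 and antipode x = - (sum of u * antipode v). The recursion terminates because the degree of the right-hand factor keeps dropping. That is exactly the shape of the recursive definition we used for YSym last time, and of the one for SSym below.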

Now, SSym is also a graded connected coalgebra:
- the degree of SSymF xs is simply length xs. Comultiplication respects the degree.
- counit is 1 on the degree zero piece, 0 otherwise.
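
(You can see the grading at work in the comult example above: the terms pair off permutations of lengths 0+4, 1+3, 2+2, 3+1 and 4+0.)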

Hence we can again give a recursive definition for the antipode:

instance (Eq k, Num k) => HopfAlgebra k SSymF where
    antipode = linear antipode'
        where antipode' (SSymF []) = return (SSymF [])
              antipode' x@(SSymF xs) = (negatev . mult . (id `tf` antipode) . removeTerm (SSymF [],x) . comult . return) x

For example:

> antipode $ ssymF [1,2,3]
-F [3,2,1]
> antipode $ ssymF [1,3,2]
-F [2,1,3]-F [2,3,1]+F [3,1,2]
> antipode $ ssymF [2,1,3]
-F [1,3,2]+F [2,3,1]-F [3,1,2]

It is possible to give an explicit expression for the antipode. However, it's a little complicated, and I haven't got around to coding it yet.

Let's just check:

> quickCheck (prop_HopfAlgebra :: Vect Q SSymF -> Bool)
+++ OK, passed 100 tests.

However, this isn't quite satisfactory. I'd like a little more insight into what the antipode actually does.

Recall that the definition requires that (mult . (id ⊗ antipode) . comult) == (unit . counit).

The left hand side is saying:
- first cut (deconcatenate) the deck of cards into two parts (in all possible ways)
- next apply the antipode to just one of the two parts
- finally shuffle the two parts together (in all possible ways)

The right hand side, remember, sends the empty permutation SSymF [] to itself, and every other permutation to zero.

So what this is saying is, you cut a (non-empty) deck of cards, wave a wand over one part, shuffle the two parts together again, and the cards all disappear!

Or is that really what it's saying? Not quite. Remember that comult (resp. mult) is a sum of all possible deconcatenations (resp. shuffles). So what this is saying is that the antipode arranges things so that when you sum over all possible deconcatenations and shuffles, they cancel each other out. Cool!
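
Here's the cancellation worked out by hand for the two-card deck F [1,2]. Deconcatenating gives F []⊗F [1,2] + F [1]⊗F [1] + F [1,2]⊗F []. Applying the antipode to the right-hand parts turns this into F []⊗F [2,1] - F [1]⊗F [1] + F [1,2]⊗F [] (using antipode (ssymF []) = ssymF [], antipode (ssymF [1]) = - ssymF [1] and antipode (ssymF [1,2]) = ssymF [2,1]). Shuffling each pair back together gives F [2,1] - (F [1,2] + F [2,1]) + F [1,2], which is zero, as required.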

(This sounds like it might have something to do with renormalization in physics, where we want to get a bunch of troublesome infinities to cancel each other out. There is apparently a connection between Hopf algebras and renormalization, but I don't know if this is it. Unfortunately my understanding of physics isn't up to figuring this all out.)


So we've now seen how to define Hopf algebra structures on two different sets of combinatorial objects: trees and permutations. We'll see that these two Hopf algebras are actually quite closely related, and that there is something like a family of combinatorial Hopf algebras with interesting connections to one another.

My main source for this article was Aguiar, Sottile, Structure of the Malvenuto-Reutenauer Hopf algebra of permutations.
