Saturday, 16 April 2011

What is an Algebra?

Over the last few months, we've spent somewhat longer than I originally expected looking at vector spaces, direct sums and tensor products. I hope you haven't forgotten that the reason we were doing this is because we want to look at quantum algebra, and "quantum groups". What are quantum groups? Well, one thing they are is algebras - so the next thing we need to do is define algebras.

Informally, an algebra is just a vector space which is also a ring (with unit) - or to put it another way, a ring (with unit) which is also a vector space. So a straightforward definition would be: A is an algebra over a field k if
(i) A is an additive group (this is required by both vector spaces and rings)
(ii) There is a scalar multiplication smult :: k×A -> A (satisfying some laws, as discussed in a previous post)
(iii) There is a multiplication mult :: A×A -> A, satisfying some laws:
- mult is associative: a(bc) = (ab)c
- mult distributes over addition: a(b+c) = (ab)+(ac), (a+b)c = (ac)+(bc)
- mult is compatible with scalar multiplication: (λa)b = λ(ab) = a(λb), for scalars λ
(iv) There is a unit :: A, which is an identity for mult: 1a = a = a1

Some examples:
- C is an algebra over R (2-dimensional as a vector space)
- 2×2 matrices over a field k form a k-algebra (4-dimensional)
- polynomials over a field k form a k-algebra (infinite-dimensional)
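
To make the first of these concrete: as a real vector space, C has basis {1, i}, and once we know that i·i = -1, the whole multiplication is forced by bilinearity:
(a + bi)(c + di) = ac + ad i + bc i + bd i^2 = (ac - bd) + (ad + bc)i
The quaternions, which we'll build later in this post, work in exactly the same way, just with a bigger basis.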

It would be fairly straightforward to translate these definitions into a Haskell type class as they are. However, we're going to do things slightly differently, for two reasons.

Firstly, we would like to use the language of category theory, and define multiplication and unit in terms of linear maps (arrows in the category of vector spaces). Specifically, we define linear maps:
unit :: k -> A
mult :: A⊗A -> A

We then require that the following diagrams commute:

Associativity:

[commutative diagram: A⊗A⊗A → A⊗A → A, going either via mult⊗id then mult, or via id⊗mult then mult]

When we say that this diagram commutes, what we mean is that it doesn't matter which way you follow the arrows: the result is the same. Specifically:
mult . (mult⊗id) == mult . (id⊗mult)

Unit:

[commutative diagram: k⊗A → A⊗A → A via unit⊗id then mult, and A⊗k → A⊗A → A via id⊗unit then mult, compared with the isomorphisms k⊗A ≅ A ≅ A⊗k]

In this case there are two commuting triangles, so we are saying:
mult . (unit⊗id) == unitOutL
mult . (id⊗unit) == unitOutR
(where unitOutL, unitOutR are the relevant isomorphisms, which we defined last time.)

But hold on, haven't we forgotten one of the requirements? What about distributivity? Well, notice that the signature of mult was A⊗A -> A, not A×A -> A. A linear map from the tensor product corresponds to a map from A×A that is linear in each argument (that is the defining property of the tensor product) - and bilinearity gives us distributivity (and compatibility with scalar multiplication) for free. So in this version, distributivity is built into the definition of mult. Neat, eh?
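
To spell that out: in the tensor product we have (x+y)⊗z = x⊗z + y⊗z, so linearity of mult gives
mult ((x+y)⊗z) = mult (x⊗z) + mult (y⊗z)
which is exactly the distributive law (x+y)z = xz + yz. Similarly (λx)⊗y = λ(x⊗y) gives the compatibility law (λx)y = λ(xy).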

The second reason our definition will be different is that in HaskellForMaths, all our vector spaces are free vector spaces over some basis type b: V = Vect k b. Consequently, our algebras will be A = Vect k a, where a is a k-basis for the algebra. Because of this, it will turn out to be more natural to express some things in terms of a, rather than A.

With those forewarnings, here's the HaskellForMaths definition of an algebra:


module Math.Algebras.Structures where

import Math.Algebras.VectorSpace
import Math.Algebras.TensorProduct

class Algebra k a where
    unit :: k -> Vect k a
    mult :: Vect k (Tensor a a) -> Vect k a

In this definition, a represents the basis of the algebra, not the algebra itself, which is A = Vect k a. Recall that we defined a type Tensor a b, which is a basis for Vect k a ⊗ Vect k b.


If we wanted to stay a bit closer to the category theory definition, we could try continuing with the type family approach that we looked at last time:


type TensorProd k u v =
    (u ~ Vect k a, v ~ Vect k b) => Vect k (Tensor a b)

class Algebra2 k a where
    unit2 :: k -> a
    mult2 :: TensorProd k a a -> a

In this definition a is the algebra itself. I'm not going to pursue that approach any further here, but if anyone fancies giving it a go, I'd be interested to hear how they get on.


Anyway, as discussed, unit and mult are required to be linear maps. We could check this using QuickCheck, but in practice we will always define unit and mult in such a way that they are clearly linear.

However, we can write a QuickCheck property to check the other requirements:


prop_Algebra (k,x,y,z) =
    (mult . (id `tf` mult)) (x `te` (y `te` z)) ==
         (mult . (mult `tf` id)) ((x `te` y) `te` z)              && -- associativity
    unitOutL (k' `te` x) == (mult . (unit' `tf` id)) (k' `te` x)  && -- left unit
    unitOutR (x `te` k') == (mult . (id `tf` unit')) (x `te` k')     -- right unit
    where k' = k *> return ()

(Recall that when we wish to consider k as a vector space, we represent it as the free vector space Vect k ().)
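
One thing the property uses that we haven't defined in this post is unit' - the version of unit whose source is the free vector space Vect k () rather than the bare scalar k, so that unit' `tf` id makes sense as a tensor product of linear maps. HaskellForMaths provides it in Math.Algebras.Structures; as a rough sketch of what it has to do (not necessarily the library's exact definition):

-- a sketch only - the real definition is in Math.Algebras.Structures
-- read off the coefficient of the basis element (), and hand it to unit
unit' :: (Num k, Algebra k b) => Vect k () -> Vect k b
unit' (V ts) = unit (sum [c | ((),c) <- ts])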


When we have an algebra, then we have a ring, so we can define a Num instance:


instance (Num k, Eq b, Ord b, Show b, Algebra k b) => Num (Vect k b) where
    x+y = x <+> y
    negate x = neg x
    x*y = mult (x `te` y)
    fromInteger n = unit (fromInteger n)
    abs _ = error "Prelude.Num.abs: inappropriate abstraction"
    signum _ = error "Prelude.Num.signum: inappropriate abstraction"

This means that when we have an algebra, we'll be able to write expressions using the usual arithmetic operators.


Okay, so how about some examples of algebras? Well, we mentioned the complex numbers as an algebra over the reals, but let's go one better and define the quaternion algebra. This is a four-dimensional algebra over any field k, with basis {1, i, j, k}, satisfying the relations i^2 = j^2 = k^2 = ijk = -1. (It follows, for example, that (ijk)k = (-1)k, so ij(k^2) = -k; but k^2 = -1, so -ij = -k, and hence ij = k.)
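
Working through the remaining products in the same way gives the whole multiplication table:
ij = k, jk = i, ki = j
ji = -k, kj = -i, ik = -j
(swapping the order of two distinct imaginary units flips the sign). This is exactly the table encoded in the Algebra instance below.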

Here's the code. First, we define our basis. The quaternions are traditionally denoted H, after Hamilton, who discovered them:


data HBasis = One | I | J | K deriving (Eq,Ord)

type Quaternion k = Vect k HBasis

i,j,k :: Num k => Quaternion k
i = return I
j = return J
k = return K

instance Show HBasis where
    show One = "1"
    show I = "i"
    show J = "j"
    show K = "k"

instance (Num k) => Algebra k HBasis where
    unit x = x *> return One
    mult = linear m
         where m (One,b) = return b
               m (b,One) = return b
               m (I,I) = unit (-1)
               m (J,J) = unit (-1)
               m (K,K) = unit (-1)
               m (I,J) = return K
               m (J,I) = -1 *> return K
               m (J,K) = return I
               m (K,J) = -1 *> return I
               m (K,I) = return J
               m (I,K) = -1 *> return J

Note that unit and mult are both linear by definition.

Let's just check that the code works as expected:

> :l Math.Algebras.Quaternions
> i^2
-1
> j^2
-1
> i*j
k


Now, are we sure that the quaternions are an algebra? Well, it's clear from the definition that the left and right unit conditions hold - see the lines m (One,b) = return b and m (b,One) = return b. But it's not obvious that the associativity condition holds, so perhaps we should quickCheck it:


instance Arbitrary HBasis where
    arbitrary = elements [One,I,J,K]

prop_Algebra_Quaternion (k,x,y,z) = prop_Algebra (k,x,y,z)
    where types = (k,x,y,z) :: (Q, Quaternion Q, Quaternion Q, Quaternion Q)

> quickCheck prop_Algebra_Quaternion
+++ OK, passed 100 tests.
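
We can also spot-check a particular instance of associativity by hand. From the relations, both bracketings of i*j*k should come out as -1:

> (i*j)*k
-1
> i*(j*k)
-1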


Okay, how about 2×2 matrices? These form a four-dimensional algebra spanned by the elementary matrices {e11, e12, e21, e22}, where eij is the matrix with a 1 in the (i,j) position and 0s elsewhere:


data Mat2 = E2 Int Int deriving (Eq,Ord,Show)

instance Num k => Algebra k Mat2 where
   unit x = x *> V [(E2 i i, 1) | i <- [1..2] ]
   mult = linear mult' where
        mult' (E2 i j, E2 k l) = delta j k *> return (E2 i l)

delta i j | i == j    = 1
          | otherwise = 0

Notice the way that we only have to define multiplication on our basis elements, the elementary matrices eij, and the rest follows by bilinearity. Notice also that unit and mult are linear by definition. Let's just sanity check that this works as expected:

> :l Math.Algebras.Matrix
> unit 1 :: Vect Q Mat2
E2 1 1+E2 2 2
> let a = 2 *> return (E2 1 2) + 3 *> return (E2 2 1)
> a^2
6E2 1 1+6E2 2 2
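
That's as expected: writing a = 2e12 + 3e21, bilinearity gives
a^2 = 4 e12e12 + 6 e12e21 + 6 e21e12 + 9 e21e21 = 6e11 + 6e22
since e12e12 = e21e21 = 0 (the delta factor kills them), while e12e21 = e11 and e21e12 = e22.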


It's straightforward to define an Arbitrary instance for Mat2, and quickCheck that it satisfies the algebra conditions.
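
For example, something along these lines should do, following the same pattern as the quaternion check above (prop_Algebra_Mat2 is just a name I'm using here, and it relies on the same Arbitrary instance for Vect k b as before):

instance Arbitrary Mat2 where
    arbitrary = elements [E2 i j | i <- [1,2], j <- [1,2]]

prop_Algebra_Mat2 (k,x,y,z) = prop_Algebra (k,x,y,z)
    where types = (k,x,y,z) :: (Q, Vect Q Mat2, Vect Q Mat2, Vect Q Mat2)

Running quickCheck prop_Algebra_Mat2 should then report that all tests pass, just as it did for the quaternions.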


Finally, what about polynomials? Let's go one better, and define polynomials in more than one variable. An obvious basis for these polynomials as a vector space is the set of monomials. For example, polynomials in x,y,z are a vector space on the basis { x^i y^j z^k | i <- [0..], j <- [0..], k <- [0..] }.

Recall that our vector space code requires that the basis be an Ord instance, so we need to define an ordering on monomials. There are many ways to do this. We'll use the graded lex or glex ordering, which says that monomials of higher degree sort before those of lower degree, and among those of equal degree, lexicographic (alphabetical) ordering applies.
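
For example, among the monomials of degree at most two in x and y, the glex order is:
x^2, xy, y^2, x, y, 1
- the degree two monomials first (in lex order among themselves), then degree one, then the constant monomial. You can see this order in the output of the (x+y+z)^3 example at the end of this post.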

Given any set X of variables, we can construct the polynomial algebra over X as the vector space with basis the monomials in X. In the example above, we had X = {x,y,z}. For this reason, we'll allow our glex monomials to be polymorphic in the type of the variables. (In practice though, as you will see shortly, we will often just use String as the type of our variables.)

So a glex monomial over variables v is basically just a list of powers of elements of v:

data GlexMonomial v = Glex Int [(v,Int)] deriving (Eq)
-- The initial Int is the degree of the monomial. Storing it speeds up equality tests and comparisons


For example, x^3 y^2 would be represented as Glex 5 [("x",3),("y",2)].


instance Ord v => Ord (GlexMonomial v) where
    compare (Glex si xis) (Glex sj yjs) =
        compare (-si, [(x,-i) | (x,i) <- xis]) (-sj, [(y,-j) | (y,j) <- yjs])
-- all the minus signs are to make things sort in the right order

[There's a bug in the HaskellForMaths v0.3.2 version of this Ord instance - the above code, which fixes it, will be in the next release.]

I won't bore you with the Show instance - it's a bit fiddly.

The Algebra instance uses the fact that monomials form a monoid under multiplication, with unit 1. Given any monoid, we can form the free vector space having the monoid elements as basis, and then lift the unit and multiplication in the monoid into the vector space, thus forming an algebra called the monoid algebra. Here's the construction:


instance (Num k, Ord v) => Algebra k (GlexMonomial v) where
    unit x = x *> return munit
        where munit = Glex 0 []
    mult xy = nf $ fmap (\(a,b) -> a `mmult` b) xy
        where mmult (Glex si xis) (Glex sj yjs) = Glex (si+sj) $ addmerge xis yjs

(The addmerge function merges the two lists of variable powers, adding the exponents when the same variable occurs in both; provided the variables were listed in ascending order in both inputs, they are still so in the output.)
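
addmerge itself isn't shown in this post - the real definition is part of HaskellForMaths - but as a sketch of the sort of thing it needs to do, it's just a merge of two ascending association lists:

-- a sketch of addmerge: merge two ascending lists of (variable, exponent) pairs,
-- adding the exponents when the same variable occurs in both
-- (the library version may differ in detail, e.g. discarding zero entries)
addmerge :: Ord v => [(v,Int)] -> [(v,Int)] -> [(v,Int)]
addmerge ((x,i):xis) ((y,j):yjs) =
    case compare x y of
        LT -> (x,i) : addmerge xis ((y,j):yjs)
        EQ -> (x,i+j) : addmerge xis yjs
        GT -> (y,j) : addmerge ((x,i):xis) yjs
addmerge xis [] = xis
addmerge [] yjs = yjs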

Finally, here's a convenience function for injecting a variable into the polynomial algebra:

glexVar v = V [(Glex 1 [(v,1)], 1)]

Then for example, we can do the following:

type GlexPoly k v = Vect k (GlexMonomial v)

> let x = glexVar "x" :: GlexPoly Q String
> let y = glexVar "y" :: GlexPoly Q String
> let z = glexVar "z" :: GlexPoly Q String
> (x+y+z)^3
x^3+3x^2y+3x^2z+3xy^2+6xyz+3xz^2+y^3+3y^2z+3yz^2+z^3


I hope you were impressed at how easy that all was. The foundation of free vector spaces, tensor products and algebras is only around a hundred lines of code. It then takes just another dozen lines or so to define an algebra: just define a basis, and define how the basis elements multiply - the rest follows by linearity.
