Suppose we have objects A and B in some category. Then their coproduct (if it exists) is an object A+B, together with injections i1: A -> A+B, i2: B -> A+B, with the following universal property: whenever we have arrows f: A -> T and g: B -> T, they factor uniquely through A+B, giving an arrow f+g: A+B -> T such that f = (f+g) . i1 and g = (f+g) . i2.
Notice that this definition doesn't give us a construction for the coproduct: in any given category, it tells us neither how to construct the coproduct, nor even whether one exists. And even if we have a construction in one category, there is no guarantee that it, or anything similar, will work in a related category.
In the category Set, the coproduct of sets A and B is their disjoint union. In order to see this, we can work in the category Hask of Haskell types. We can regard Hask as a subcategory of Set, by identifying a type with its set of inhabitants. If a and b are Haskell types / sets of inhabitants, then their disjoint union is Either a b. The elements of Either a b can be from either a or b (hence, from their union), and they are kept disjoint in the left and right parts of the union (so that for example Either a a contains two copies of a, not just one). The injections i1 and i2 are then the value constructors Left and Right. Given f :: a -> t, g :: b -> t, we define:
(.+.) :: (a -> t) -> (b -> t) -> Either a b -> t
(f .+. g) (Left a) = f a
(f .+. g) (Right b) = g b
Then it should be clear that (f .+. g) . Left = f and (f .+. g) . Right = g, as required.
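(In fact (.+.) is just the Prelude's either function under another name.) As a quick concrete check of the factorization property, here's a small snippet; the particular arrows f and g are illustrative choices of mine, not anything canonical:

```haskell
-- illustrative arrows into a common target t = Int
f :: String -> Int
f = length

g :: Bool -> Int
g = fromEnum

-- the coproduct map f .+. g, written here via the Prelude's equivalent `either`
fg :: Either String Bool -> Int
fg = either f g
```

Then fg . Left behaves exactly as f, and fg . Right exactly as g.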
In a moment, we'll look at coproducts of vector spaces, but first, as a warmup, let's think about coproducts in a simpler category: lists / free monoids. Recall that a monoid is an algebraic structure having an associative operation ++ and an identity for ++ called []. (That is, [] ++ x = x = x ++ [].)
A monoid homomorphism is a function f :: [a] -> [b] such that f [] = [] and f (a1 ++ a2) = f a1 ++ f a2. With a little thought, you should be able to convince yourself that all monoid homomorphisms are of the form concatMap f', where f' :: a -> [b]. (Which is, incidentally, the same as saying that they are of the form (>>= f').) In the category of free monoids, the arrows are constrained to be monoid homomorphisms.
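To see the homomorphism laws in action, here is a small sanity check; the particular f' is an illustrative choice of mine:

```haskell
-- an arbitrary choice of f' :: a -> [b]
f' :: Int -> String
f' n = replicate n 'x'

-- the induced monoid homomorphism concatMap f'
hom :: [Int] -> String
hom = concatMap f'
```

On any inputs, hom [] = [] and hom (xs ++ ys) = hom xs ++ hom ys.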
So for our coproduct, we are looking for an object C, together with injection homomorphisms i1 :: [a] -> C and i2 :: [b] -> C, such that any pair of monoid homomorphisms f :: [a] -> [t] and g :: [b] -> [t] factors uniquely through C.
Perhaps the first thing to try is the disjoint union: Either [a] [b]. This is the coproduct of [a] and [b] as sets, but is it also their coproduct as monoids? Well, let's see.
Hmm, firstly, it's not a list (doh!): so you can't apply ++ to it, and it doesn't have a []. However, before we give up on that, let's consider whether we're asking the right question. Perhaps we should only be requiring that Either [a] [b] is (or can be made to be) a Monoid instance. Is the disjoint union of two monoids a monoid? Suppose we try to define ++ for it:
Left a1 ++ Left a2 = Left (a1++a2)
Right b1 ++ Right b2 = Right (b1++b2)
But now we begin to see the problem. What are we going to do for Left as ++ Right bs? There's nothing sensible we can do, because our disjoint union Either [a] [b] does not allow mixed lists of as and bs.
However, this immediately suggests that we would be better off looking at [Either a b] - the free monoid over the disjoint union of a and b. This is a list - and it does allow us to form mixed lists of as and bs.
We can then set i1 = map Left, i2 = map Right, and these are monoid homomorphisms (they interact with [] and ++ in the required way). Then, given f = concatMap f' and g = concatMap g', we can define:
h = concatMap h' where
    h' (Left a) = f' a
    h' (Right b) = g' b
So our suspicion is that [Either a b] is the coproduct, with h the required coproduct map.
Let's just check the coproduct conditions:
h . i1
= concatMap h' . map Left
= concatMap (h' . Left)
= concatMap f'
= f
and similarly, h . i2 = g, as required.
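The whole free-monoid check can be run as a self-contained snippet; the component maps f' and g' below are illustrative choices of mine (any f' :: a -> [t] and g' :: b -> [t] would do):

```haskell
-- the injections into the free monoid on the disjoint union
i1 :: [a] -> [Either a b]
i1 = map Left

i2 :: [b] -> [Either a b]
i2 = map Right

-- illustrative component maps
f' :: Int -> String
f' = show

g' :: Char -> String
g' c = [c, c]

-- the induced homomorphisms out of [Int] and [Char]
f :: [Int] -> String
f = concatMap f'

g :: [Char] -> String
g = concatMap g'

-- the coproduct map out of [Either Int Char]
h :: [Either Int Char] -> String
h = concatMap h'
  where h' (Left a)  = f' a
        h' (Right b) = g' b
```

Note that h also acts sensibly on mixed lists, which is exactly what Either [Int] [Char] couldn't offer.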
Notice that the disjoint union Either [a] [b] is (isomorphic to) a subset of the free monoid [Either a b], via Left as -> map Left as; Right bs -> map Right bs. So we were thinking along the right lines in suggesting Either [a] [b]. The problem was that Either [a] [b] isn't a monoid, it's only a set. We can regard [Either a b] as the closure of Either [a] [b] under the monoid operations. [Either a b] is the smallest free monoid containing the disjoint union Either [a] [b] (modulo isomorphism of Haskell types).
(This is a bit hand-wavy. This idea of closure under algebraic operations makes sense in maths / set theory, but I'm not quite sure how best to express it in Haskell / type theory. If anyone has any suggestions, I'd be pleased to hear them.)
Okay, so what about a coproduct in the category of k-vector spaces? First, recall that the arrows in this category are linear maps f, satisfying f (a+b) = f a + f b and f (k*a) = k * f a. Again, it should be clear that a linear map is fully determined by its action on basis elements - so every linear map f :: Vect k a -> Vect k b can be expressed as linear f', where f' :: a -> Vect k b.
Recall that we defined linear f' last time - it's really just (>>= f'), but followed by reduction to normal form:
linear f v = nf $ v >>= f
Okay, so vector spaces of course have underlying sets, so we will expect the coproduct of Vect k a and Vect k b to contain the disjoint union Either (Vect k a) (Vect k b). As with lists though, we will have the problem that this is not closed under vector addition - we can't add an element of Vect k a to an element of Vect k b within this type.
So as before, let's try Vect k (Either a b). Then we can set i1 = fmap Left, i2 = fmap Right, and they are both linear maps by construction. (We don't need to call nf afterwards, since Left and Right are order-preserving.)
i1 = fmap Left
i2 = fmap Right
Then we can define the coproduct map (f+g) as follows:
coprodf f g = linear fg' where
    fg' (Left a) = f (return a)
    fg' (Right b) = g (return b)
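To check this concretely without the library to hand, here is a self-contained sketch: Vect, nf and linear are minimal stand-ins of mine (with linear written out directly rather than via >>=), and the sample maps f and g are illustrative choices, not anything canonical.

```haskell
import Data.List (sortBy)
import Data.Ord (comparing)

-- minimal stand-in for the library's free vector space type (sketch only)
newtype Vect k b = V { unV :: [(b,k)] } deriving (Eq, Show)

-- normal form: basis elements sorted, coefficients combined, zero terms dropped
nf :: (Ord b, Num k, Eq k) => Vect k b -> Vect k b
nf (V ts) = V $ filter ((/= 0) . snd) $ combine $ sortBy (comparing fst) ts
  where combine ((b1,x1):(b2,x2):rest) | b1 == b2 = combine ((b1,x1+x2):rest)
        combine (t:rest) = t : combine rest
        combine [] = []

-- extend f' :: a -> Vect k b to a linear map (the library does this via >>=)
linear :: (Ord b, Num k, Eq k) => (a -> Vect k b) -> Vect k a -> Vect k b
linear f' (V ts) = nf $ V $ concat [ [ (b, x*y) | (b,y) <- unV (f' a) ] | (a,x) <- ts ]

-- the injections: basis-renaming maps (Left and Right are order-preserving)
i1 :: Vect k a -> Vect k (Either a b)
i1 (V ts) = V [ (Left a, x) | (a,x) <- ts ]

i2 :: Vect k b -> Vect k (Either a b)
i2 (V ts) = V [ (Right b, x) | (b,x) <- ts ]

-- the coproduct map, as in the text (return a is V [(a,1)])
coprodf :: (Ord t, Num k, Eq k)
        => (Vect k a -> Vect k t) -> (Vect k b -> Vect k t)
        -> Vect k (Either a b) -> Vect k t
coprodf f g = linear fg'
  where fg' (Left a)  = f (V [(a,1)])
        fg' (Right b) = g (V [(b,1)])

-- illustrative linear maps into a common target space
f :: Vect Integer Int -> Vect Integer Int
f = linear (\a -> V [(a+1, 2)])

g :: Vect Integer Char -> Vect Integer Int
g = linear (\c -> V [(fromEnum c, 1)])
```

With these definitions, coprodf f g . i1 agrees with f, and coprodf f g . i2 with g, on any input vector, and coprodf f g also handles vectors that mix basis elements from both sides.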
We need to verify that this satisfies the coproduct conditions: (f+g) . i1 = f and (f+g) . i2 = g. It would be nice to test this using a QuickCheck property. In order to do that, we need a way to construct arbitrary linear maps. (This is not the same thing as arbitrary functions Vect k a -> Vect k b, so I don't think that I can use QuickCheck's CoArbitrary class - but the experts may know better.) Luckily, that is fairly straightforward: we can construct arbitrary lists [(a, Vect k b)], and then each pair (a,vb) can be interpreted as saying that the basis element a is taken to the vector vb.
type LinFun k a b = [(a, Vect k b)]
linfun :: (Eq a, Ord b, Num k) => LinFun k a b -> Vect k a -> Vect k b
linfun avbs = linear f where
    f a = case lookup a avbs of
          Just vb -> vb
          Nothing -> zero
With that preparation, here is a QuickCheck property that expresses the coproduct condition.
prop_Coproduct (f',g',a,b) =
    f a == (fg . i1) a &&
    g b == (fg . i2) b
    where f = linfun f'
          g = linfun g'
          fg = coprodf f g
That property can be used for any vector spaces. Let's define some particular vector spaces to do the test on.
newtype ABasis = A Int deriving (Eq,Ord,Show,Arbitrary) -- GeneralizedNewtypeDeriving
newtype BBasis = B Int deriving (Eq,Ord,Show,Arbitrary)
newtype TBasis = T Int deriving (Eq,Ord,Show,Arbitrary)
instance (Num k, Ord b, Arbitrary k, Arbitrary b) => Arbitrary (Vect k b) where
    arbitrary = do ts <- arbitrary :: Gen [(b, k)] -- ScopedTypeVariables
                   return $ nf $ V ts
(I should emphasize that not all vector space bases are newtypes around Int - we can have finite bases, or bases with other interesting internal structure, as we will see in later installments. For the purposes of this test however, I think this is sufficient.)
prop_CoproductQn (f,g,a,b) = prop_Coproduct (f,g,a,b)
    where types = (f,g,a,b) :: (LinFun Q ABasis TBasis, LinFun Q BBasis TBasis, Vect Q ABasis, Vect Q BBasis)
> quickCheck prop_CoproductQn
+++ OK, passed 100 tests.
So we do indeed have a coproduct of vector spaces. To summarise: the coproduct of free vector spaces is the free vector space on the coproduct of their bases.
Next time, we'll look at products - where there might be a small surprise.