Thursday, January 16, 2014

Lamlosuo — eliminating grammatical nouns and verbs

Language is a by-product of general properties of human cognition [...] in conjunction with the constraints on communication that are common to evolved primates [...] and the overarching constraints of human cultures on the languages that evolve from them.
— Daniel L. Everett, Don't Sleep, There are Snakes, 2009, Chapter 15.

This post is about a conlang that seeks to completely undermine the grammatical notions of noun and verb — but leaves the lexical noun/verb distinction more-or-less intact.

In the pipeline, I'm assembling some thoughts on continuations; but this material is ready to run.  I find various insights from Lamlosuo apply to programming language design, since both centrally concern the way the mind processes language — but conlanging is, to me at least (and it seems I'm not the only programming language person who thinks so), a pleasure worth savoring for its own sake.  This also relates to my ideas on the role of fiction in science, briefly mentioned in an earlier post.

Lamlosuo is an exploratory prototype; it's designed to explore the practical consequences of the grammatical premise of a project I've been working on since about the turn of the century (early 2000), in preparation for more naturalistic conlangs to be developed for the project later.  Consequently, the prototype's coverage of different areas of conlang design is spotty; and my account of it here will spotlight particular areas where the exploration has produced interesting results, while inevitably shortchanging, at least for the moment, other areas that don't bear on those particular results.  So don't be surprised by the uneven coverage.

Here's a quick list of some interestingly unexpected developments in Lamlosuo — most of which, alas, will end up deferred to a later blog post (a full description of even a modest conlang is a fairly large beast).

  • I'd first imagined that since Lamlosuo isn't trying to be naturalistic, it would have no reason for irregularities.  That seems very naive to me now, but of course my view of it now includes the insights Lamlosuo has given me into the nature of linguistic irregularity.
  • I deliberately omitted conjugation; but a different sort of morphosyntactic variation arose, with curious similarities and differences to conjugation.
  • I'd planned to eventually include an analog to the copular verb, if only to demonstrate that it just wasn't all that important — that it wouldn't be used even though it was there.  When I finally tried to implement that plan, the grammar rejected the addition.
  • I really meant the language to be isolating.  Eventually the structural integrity of the language pushed me to add a stiff dose of incorporation.
  • As alluded to above:  While I eliminated the noun/verb distinction from the grammatical structure of the language, a distinction still naturally arose between lexical nouns and verbs, though the distinction lacks grammatical import.  Moreover, my work on this conlang project spun off a second project that eliminated both lexical nouns and lexical verbs while leaving grammatical nouns/verbs intact.
Since I can only find room in this post to cover one major item from this list, I've chosen incorporation.  Pseudo-conjugation was tempting, but is probably best appreciated when one is already acclimated to the grammatical structure; and tracing through the route to incorporation is a good way to acclimate.

Contents
Preliminaries: Nouns, verbs, and thought
Phonology and phonotactics
Vectors
Role alignment
Gender
Prefixes
Subordinate content particles
Provectors
Incorporation
Preliminaries: Nouns, verbs, and thought

Nouns and verbs play a central role in the grammar of, afaics, all human natlangs, and most (arguably all the naturalistic) conlangs I've seen.  I'm not talking about lexical nouns and verbs; there are North American languages that subvert the traditional distinction between the lexical categories of nouns and verbs, and of course there's Kēlen that subverts lexical verbs.  However, I see all these languages (including Kēlen) retaining nouns and verbs at the grammatical level:  a simple clause is still constructed from a verb with noun arguments, which information can then be distributed amongst words in various ways.

This conlang project undertakes to demonstrate a naturalistic class of languages that do not use the verb-with-noun-arguments grammatical construct.  That is, I'm trying to build the conlangs and describe a species of conspeakers whose thought processes naturally give rise to these languages.  On the face of it, this sounds like some sort of investigation of either Whorfianism, or Chomskyism, or both; but it isn't either by intent.

I was first struck by the grammatical centrality of nouns and verbs when contemplating the extreme non-centrality of English prepositions, whose meanings can be wildly context-dependent (and accordingly very difficult to translate to/from other languages).  I mused on the possibility of a language in which pairwise relations between nouns/verbs are more grammatically important, while nouns/verbs themselves are more context-dependent.  Meditating on this idea for a dozen years or so, I settled on the notion that while we evolved on land, where survival favored structuring our thoughts in terms of an activity and its participants, the conspeakers evolved in the open ocean where survival favored structuring their thoughts in terms of navigation.  I'd been inspired here by a claim, in the context of a discussion of the fact that dolphins and whales have bigger brains than we do, that while their brains are bigger, most of that extra wetware is oriented toward navigation; and, drawing further inspiration from the prototype of a treasure map —so many paces that way, then turn to face that thing and go till you come to such-and-such, then etc. etc.— I figured a simple clause for the conspeakers would be an arbitrarily long chain of content words, each with a specified semantic connection to the next in the chain.

Why the emphasis on how the grammatical structure arises from the conspeakers' thought processes?  Because I don't buy in to the incomprehensible aliens motif; I didn't think humans would naturally develop languages with this sort of grammar (insert anadew disclaimer here), therefore I wouldn't consider it plausible unless I could explain why some alien species would do so.  I am inclined to believe there is a principle for general-purpose intelligence analogous to computation's Church-Turing conjecture:  just as any two sufficiently powerful computation engines are capable of simulating each other, any two minds with general-purpose intelligence are capable of comprehending each other — in principle.  Intelligent minds can easily fail to figure each other out —last I heard, we still weren't even sure whether or not rongorongo is writing, let alone what it means— but this is quite different from supposing some fundamental limitation of thought processes that would inherently prevent them from comprehending each other.  I consider the incomprehensible-aliens gambit a bad tactic in science fiction, because I think the reader can feel the difference if they're reading fiction with no coherent framework behind what they can see.  The author should know how the aliens think.  (I'm similarly skeptical about the religious motif of things beyond mortal comprehension.)

The project wasn't meant to explore Chomskyan ideas because, to be brutally honest, I've never taken the universal grammar idea seriously enough even to try to refute it.  When I first read the technical papers in which Chomsky defined the Chomsky hierarchy (my first exposure to his work, as the hierarchy bears on programming-language parsing technology), I was turned off by the apparent suggestion that we have string rewriting engines in our heads; to me, a hypothesis both unnecessary and — because it felt, for lack of a better term, digital — implausible.  At any rate, there is no universal grammar consciously behind my conlangs; and, regardless of whether the universal grammar hypothesis is right or wrong, I doubt an attempt to explicitly address it would do anything for the project except bog things down.

Since the project aspires to naturalism, to my mind this requires at least one diachronic family tree of languages, recalling my earlier remark that plausible fiction has a coherent framework behind it.  The tree has to be sketched out before elaborating the languages in it (or they wouldn't be affected by it, defeating the purpose).  At the start of the project, though, I had no clue how to sketch out such a tree because I'd already merrily invalidated the basic classifications I knew of for human languages:  (1) neutral order of the basic sentence elements subject, object, and verb (VSO, SVO, etc.); in this project, a sentence doesn't have these elements; and  (2) alignment system, nominative, ergative, or whatever; but alignment is all about coordinating the arguments to a verb, and again, in this project a sentence has no such elements.

It seemed, then, that in order to learn the things I'd need to know to sketch out a family tree of these languages, I would have to have already constructed one, to learn how such languages can work in practice.  So the prototype language would be optimized for use as an exploratory vehicle, merrily disregarding naturalism unless, of course, in some particular instance a facet of naturalism were the thing being explored.  Conlangers are often advised to deliberately add irregularities to their languages for naturalistic feel, and from this advice I'd picked up the sense that naturalism would be the only reason for irregularity in a conlang; so I had the unconsidered expectation that the non-naturalistic prototype would have no irregularities.  I was gradually disabused of this expectation.  The prototype is unnaturally regular, though, so keep in mind that this isn't a design flaw.

Phonology and phonotactics

The phonology is the first of many areas of linguistics that the prototype language mostly doesn't explore (the better, hopefully, to focus on other areas).  The larger project design discusses phonetics and phonology extensively.  The conspeakers vocalize in pure tones; the phones (speech sounds) are notes, but the phonemes are the musical intervals from one note to the next (which meshes rather well, imho, with the vectorial/navigational mindset behind the languages).  The conspeakers, realizing they were somewhat isolated by the unpronounceability of their speech sounds to most species that speak with their mouths, constructed a standard orthophony for transdicting their speech into sequences of "oral phonemes", chosen to be manageable by most other species.  There's some good fun there, but in the prototype it would only get in the way.  The prototype phonology needs to be easily constructed and easy (for me) to pronounce, so as not to discourage exploration.

I started out designing a thoroughly bland phonology.  For vowels I made a simple neutral choice:

        front   back
close   i       u
mid     e       o
open    a

I gave syllables the general form consonant-vowel-consonant, where the onset and coda would both be optional depending on context; the coda could only be a nasal — either n or m — and nasals would only be allowed in syllable codas.  The rhotic consonant r would be omitted.  These are simple generic choices.

At this point, though, it became clear that the phonology was apt to become so boring it would discourage use.  So I slipped in an experiment I'd been curious about, bearing no relation to the project at all except that it seemed to me to belong in a language that doesn't care about naturalism.  I'd wondered how well one could put together a phonology with neither plosives nor fricatives.

Unfortunately, since I was going for ease of pronunciation, and I'd already excluded onset-nasals and r, excluding plosives and fricatives left me with simply too few consonants to choose from.  I adopted three approximants — l j w, as in lore, yore, wore — but still had fewer onset consonants than vowels, which seemed wrong so I added in, after all, two unvoiced fricatives — f and s, as in fore, sore.  (Why unvoiced?  It felt right, phonaesthetically.)

              labial   labiodental   alveolar      palatal   velar
nasal         m                      n
fricative              f             s
approximant                          l (lateral)   j         w (labialized)

The arrangement of five onsets and five vowels interacted with the artificial regularity of the language, to produce a great many features of the language coming in groups of five.  The high sonority also gives the language an odd, somewhat fluid phonaesthetic.

Eventually a plosive found its way into the language, by a circuitous route; but I'll get to that.

In theory, a phonotactic rule forbids two consecutive vowels in a word.  However, in some cases, two vowels with an approximant between them sound the same as if the vowels were adjacent — for example, ije sounds like ie — and if one of these vowel-consonant-vowel sequences occurs in a word, the consonant is elided when spelling the word.  So the word is constructed without consecutive vowel phonemes, but it's spelled with consecutive vowel letters.  Precisely:  if a front vowel (i or e) is followed by j and another vowel, or if a back vowel (u or o) is followed by w and another vowel, the consonant between those vowels is elided.  For example, lamlosuwo would be shortened to lamlosuo.  Why introduce these elision rules?  Because I found I was pronouncing the phoneme sequences that way anyhow, and it made the orthography (spelling) less uniform and therefore easier to get one's bearings in (like the bumps on the letters F and J on a qwerty keyboard, that help touch typists feel when their fingers are in the right place).
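
The elision rule itself is mechanical enough to sketch in a few lines of Python (my formulation, assuming the rule applies uniformly anywhere in a word):

```python
FRONT = set("ie")     # front vowels
BACK = set("uo")      # back vowels
VOWELS = set("aeiou")

def spell(phonemes):
    """Spell a phoneme sequence: elide j after a front vowel, and w
    after a back vowel, whenever another vowel follows."""
    out = []
    for k, c in enumerate(phonemes):
        prev = phonemes[k - 1] if k else ""
        nxt = phonemes[k + 1] if k + 1 < len(phonemes) else ""
        if nxt in VOWELS and (c == "j" and prev in FRONT
                              or c == "w" and prev in BACK):
            continue                     # ije is spelled ie, uwo is spelled uo
        out.append(c)
    return "".join(out)
```

So spell("lamlosuwo") comes out as lamlosuo, which is exactly how the language's spelled name arises.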

Vectors

All the conspeakers' content words are vectors, a single grammatical function alternative to grammatical nouns or verbs (hence, the entire phylum of languages are vector languages).  Each vector describes a sort of travel, so might be thought of as a variant on the verb to go.  To identify participants in the action of a vector, any given vector may recognize a series of available vector roles, in poetic similarity to noun cases of human natlangs.  (Imagine conspeaker linguists being frustrated to encounter a language, culturally distant from theirs, in which vectors don't have fixed sets of roles, and struggling to rescue their cherished theory of grammar.)

The prototype language has five roles:

         abbr   description
cursor   CUR    the thing that goes.
start    STR    from which it goes.
end      END    to which it goes.
path     PTH    by which it goes.
pivot    PIV    catch-all role; a participant not belonging to any of the other four.

Some roles of a given vector may be unoccupied, but there is always a cursor.

Enumerating the roles of a vector turns out to be a very good way of defining the vector.  My first vector definition was for a vector meaning roughly speak:

cursor   what is said.
start    who says it.
end      who it is said to.
path     unoccupied.  (Several things might go here, but I've not yet chosen one.)
pivot    the language in which it is said.

A vector word has an invariant stem, and a mandatory class suffix.  A vector stem is two or more consonant-vowel syllables.  (Put the accent on the first syllable of the stem, btw.)  There are eleven classes — the neutral class, and ten genders.  The neutral suffix depends on the final vowel of the stem:  after a back vowel it's -wa, after a front or open vowel it's -ja (this is chosen so that the consonant on the neutral suffix is elided except when the stem ends with a).
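
A tiny sketch of the neutral-suffix rule (helper names are mine; spell is the elision rule from the phonology section):

```python
FRONT = set("ie")
BACK = set("uo")
VOWELS = set("aeiou")

def spell(phonemes):
    # elision spelling rule from the phonology section
    out = []
    for k, c in enumerate(phonemes):
        prev = phonemes[k - 1] if k else ""
        nxt = phonemes[k + 1] if k + 1 < len(phonemes) else ""
        if nxt in VOWELS and (c == "j" and prev in FRONT
                              or c == "w" and prev in BACK):
            continue
        out.append(c)
    return "".join(out)

def neutral(stem):
    """Attach the neutral class suffix: -wa after a back vowel,
    -ja after a front or open vowel, then apply elision spelling."""
    return spell(stem + ("wa" if stem[-1] in BACK else "ja"))
```

So neutral("losu") gives losua and neutral("susu") gives susua; only a stem ending in a keeps the suffix consonant in spelling.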

An engendered vector is sort-of-like a noun.  The suffix has the form consonant-vowel, where the consonant determines one of the five roles to be emphasized — making it noun-like — and the vowel determines whether the occupant of that role is volitional or nonvolitional:  a back vowel for volitional, front vowel for nonvolitional.  I'll hold off on enumerating the gender suffixes for a moment, though, because they make much more sense after seeing the role particles, which I'm about to explain.

Role alignment

As one way to orchestrate a connection between consecutive vectors, the prototype specifies a role of each vector, at which they intersect (I imagined something vaguely like tinkertoy knobs, with a choice of sockets to plug into).  There's scope for some good fun in devising alternative ways that vector languages might connect consecutive vectors, but I figured this would do nicely to explore the vector-language concept.

I figured on inserting role particles between consecutive vectors to specify their connection.  I also figured a role particle would be a single consonant-vowel syllable, making these particles immediately distinguishable from vectors; but this presented a problem.  With five roles in each vector, there are twenty-five ways to choose a role of the first vector and a role of the second vector; and there are only twenty-five consonant-vowel syllables.  So one would have to use every possible syllable, and there would be some very dense way of packing the two role-choices into that one syllable (such as, the consonant determines the first role and the vowel determines the second), with no repeated patterns to aid memorization, and no redundant information to allow for transmission errors.  So instead of a single role particle, put two particles between each pair of vectors:  first a dominant role particle specifying a role of the first vector, then a subordinate role particle specifying a role of the second vector.  The consonant in a role particle indicates the role, and the vowel is back for dominant, front for subordinate.  The particles are easier to memorize, and the pairing of close or mid vowels with the consonants provides redundancy against transmission errors.  This arrangement also left available five possible role particles using the open vowel, so I made these combined role particles, indicating in a single particle that that role was being selected for both vectors.

              combined   subordinate   dominant
              (open a)   (front)       (back)
cursor  (l)   la         li            lu
start   (f)   fa         fi            fu
end     (s)   sa         se            so
path    (j)   ja         je            jo
pivot   (w)   wa         we            wo

This table has changed a couple of times since creation.  Originally the start and end both used close vowels, which turned out to be a mistake because one really wants redundant contrasts between start and end.  Also, originally the path used a close vowel for the subordinate and mid for dominant, which had been naively intended to enhance redundant contrasts but instead turned out to be confusing since each of the other consonants used a predictable pair of vowels, and path seems to be, in practice, the least often used of the five roles.
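
Decoding a role particle is, by design, mechanical; here's a sketch (function names mine):

```python
# Consonant picks the role; vowel picks combined (open a),
# subordinate (front vowel), or dominant (back vowel).
ROLE_OF = {"l": "CUR", "f": "STR", "s": "END", "j": "PTH", "w": "PIV"}

def parse_particle(p):
    role = ROLE_OF[p[0]]
    if p[1] == "a":
        return (role, "combined")
    return (role, "subordinate" if p[1] in "ie" else "dominant")

def alignment(dominant, subordinate):
    """Given a dominant particle then a subordinate particle, recover
    which role of the first vector aligns with which role of the second."""
    return (parse_particle(dominant)[0], parse_particle(subordinate)[0])
```

For example, alignment("fu", "li") yields ("STR", "CUR"): the start of the first vector aligns with the cursor of the second.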

Usually, aligning specified roles of two consecutive vectors means that the same participant occupies both roles; however, in general the meaning of the alignment is determined by role-alignment conventions of the dominant (i.e., first) vector.  In theory, the "usual" meaning of role alignment is simply the convention most vectors adopt.  Each vector has its own unique semantic "shape" that may call for non-generic role-alignment conventions.

For example, combining losua meaning speak with a second vector susua meaning sleep one could say

losua        fu        li        susua
NEUT.speak   DOM.STR   SUB.CUR   NEUT.sleep

Since the start of losua is the speaker, and the cursor of susua is the sleeper, and losua has generic role alignment, this means that some party speaks and sleeps.  One might ask whether the participant does these things simultaneously or sequentially; losua prefers to align sequentially (rather than in parallel), so the participant speaks and then sleeps.  (Note, in passing, that there would be no way in English to say this without a subject for the verbs, such as "someone", whereas our prototype sentence has no word corresponding to the subject.)

Combined particles turned out to be highly useful, but not for their original purpose.  In practice, it seems, one rarely wants to align the same role of two consecutive vectors.  However, occasionally it is necessary or convenient for a vector to connect with other vectors in a way that doesn't fit within the usual two-particle model.  Exactly because the combined particles aren't much needed for their regular function, they can be conveniently repurposed for irregular functions; and this has rapidly become a usual practice.  Just as a vector's pivot role and two-particle role-alignment conventions absorb its low-grade idiosyncrasies, its use of combined role particles absorbs medium-grade idiosyncrasies.

After some initial fumbling in the design, neutral vectors were also allowed to occur consecutively with no intervening particles.  The two vectors fuse semantically, so that they interact with the rest of the sentence as if they were a single vector.  The alignment preference of the dominant vector determines whether they fuse in parallel, a single act of travel that is both things at once; or in sequence, a single act of travel that is the first and then the second.  This serves two commonly wanted patterns:  the parallel case is essentially adverbial (advectorial), while the second supports compound navigational paths (as in the treasure-map metaphor).

Gender

The dominant and subordinate role particles are reused as gender suffixes — dominant for volitional genders, subordinate for nonvolitional genders — with one caveat.  The labio-dental fricative f has an allophone used only in start-gender suffixes:  the dental fricative, as in thin, written as t.  Use of this allophone provides redundant information to help filter out transmission errors, which is a solid justification for it; causally, I like the unvoiced dental fricative.

        NV    V
CUR     -li   -lu
STR     -ti   -tu
END     -se   -so
PTH     -je   -jo
PIV     -we   -wo

For example, here are the (usual) engendered forms of losua and susua.  Grammatically, of course, these aren't actually nouns, even though the following list describes the engendered participants:  grammatically they're vectors, so it's still possible to align with any of their non-engendered roles.
losuli   NV.CUR   message
losutu   V.STR    speaker
losuso   V.END    audience
losuo    V.PIV    living natural language
losue    NV.PIV   dead or artificial language
susulu   V.CUR    sleeper
susuti   NV.STR   falling asleep
sususo   V.END    waking
susuje   NV.PTH   sleeping
susue    NV.PIV   dreaming

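
These engendered forms follow mechanically from the suffix table and the elision spelling rule; here's a sketch (helper names mine):

```python
SUFFIX = {                       # role -> (nonvolitional, volitional)
    "CUR": ("li", "lu"),
    "STR": ("ti", "tu"),         # t: the start-gender allophone of f
    "END": ("se", "so"),
    "PTH": ("je", "jo"),
    "PIV": ("we", "wo"),
}
FRONT, BACK, VOWELS = set("ie"), set("uo"), set("aeiou")

def spell(phonemes):
    # elision spelling rule from the phonology section
    out = []
    for k, c in enumerate(phonemes):
        prev = phonemes[k - 1] if k else ""
        nxt = phonemes[k + 1] if k + 1 < len(phonemes) else ""
        if nxt in VOWELS and (c == "j" and prev in FRONT
                              or c == "w" and prev in BACK):
            continue
        out.append(c)
    return "".join(out)

def engender(stem, role, volitional):
    """Attach the gender suffix for the given role and volitionality,
    then apply elision spelling."""
    return spell(stem + SUFFIX[role][1 if volitional else 0])
```

For instance engender("losu", "PIV", True) yields losuo: the w of -wo is elided after the back vowel u.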
Semantically, most vectors when engendered take on habitual aspect; so, for example, losutu is a sometime-speaker, rather than the speaker on a particular occasion.  Syntactically, if you are role-aligning on the engendered role of a vector, you can omit the role particle that says so.  Thus,
losutu        li        susua
V.STR.speak   SUB.CUR   NEUT.sleep
the sometime-speaker sleeps

losua        fu        susulu
NEUT.speak   DOM.STR   V.CUR.sleep
the sometime-sleeper speaks

losutu        susulu
V.STR.speak   V.CUR.sleep
the sometime-speaker (is a) sometime-sleeper

Prefixes

A vector can be modified by one or more prefixes of the form consonant-vowel-nasal.  To provide redundancy, I'd like to avoid using two prefixes that differ only by n-versus-m, leaving only twenty-five possible prefixes; naturally, I'm trying to reserve them for elemental functions.  I revise the prefix set from time to time, as my ideas evolve about what is most useful.

An especially elemental, and stable, prefix is lam-, which has deictic effect, causing the vector to refer to the immediate situation.  This is notably used with engendered forms of losua:

lamlosuli   what is now being said (the utterance in which the word lamlosuli occurs).
lamlosutu   the party now speaking (first person).
lamlosuso   the party now being spoken to (second person).
lamlosuo    the language now being spoken.

It was rather cool to get the name of the conlang for free (choosing to assume it would be imported into English as a proper noun, so it would still mean the conlang when used in an English sentence); but lamlosutu and lamlosuso seemed excessively clumsy ways to express something as basic as the first and second persons.  Such clumsiness didn't match the exploratory mission of the language; so I posited there would be contractions for those words — latu for first person, laso for second.
laso                fi        losua
V.END.this.speech   SUB.STR   NEUT.speak
you speak

Subordinate content particles

As part of its exploratory mission, the prototype is meant to have a minimalist grammar.  The theory is that an exploratory design has to be changed, and then the revised version has to be relearned — and vocabulary is (says the theory) easier to change, and easier to relearn, than grammar.  In particular, I didn't want to conjugate vectors, because that weighs down the grammar.  How, then, to express something like past tense?

First I crafted a vector for the purpose:  silea, meaning age.  The cursor silelu would be the party who ages, start siletu the time they came from, etc.; but deictic lamsilea seemed so much more useful than silea itself that I figured in ordinary usage one would let the prefix be understood and simply say "siletu" for "the past", and so on.  (Notice how little irregularities are piling up?)

To relate an arbitrary sentence to this time-sense vector, I crafted a new set of grammatical words — subordinate content particles, each of which is just a single vowel.

Siletu a laso fi losua.

The subordinate content particle initiates a subclause, here laso fi losua, and then packages up that clause as if it were the occupant of the subordinate role in a role alignment.  The vowel in the content particle determines the mood of the subclause, here indicative.  The alignment with siletu is understood to specify location of the subclause (here, location in time) — so, "you spoke".  (Is this really a practical way to handle tense?  I don't claim to know; this is a game of gradual exploration.)
siletu             a       laso                fi        losua
V.STR.this.aging   INDIC   V.END.this.speech   SUB.STR   NEUT.speak
in the past: you speak

To assign mood to an entire sentence, simply place the appropriate-mood subordinate content particle at its start.  Using the invitational particle i,
i       laso                fi        losua
INVIT   V.END.this.speech   SUB.STR   NEUT.speak
please, you speak

When an invitational or imperative clause starts with laso which is then aligned with the agent of a neutral vector, the laso and its associated particle are usually left implicit, unless one wants to emphasize the second-person.  (Yes, this is very like English, where one says "please, speak" or "please, you speak"; although ordinarily one would avoid imitating English, here it's potentially interesting that some things aren't necessarily affected by the switch from verb-noun to vector grammar.)  Agents, and one or two other logical roles in a vector, pop up from time to time as the prototype develops; their use in the invitational/imperative elision convention wasn't a novelty.  The agent is a participant understood as causing the action, and if there is an agent it's usually either the cursor, the start, or the pivot.
I losua. — Please speak.
I losua so latu. — Please speak to me.
I losua wo lamlosuo. — Please speak in Lamlosuo.

Here's the complete set of subordinate content particles.

        front             back
close   i  invitational   u  imperative
mid     e  noncommittal   o  tentative
open    a  indicative

Provectors

What if one wanted to say "please speak to me in Lamlosuo"?  The linear simple clause structure only allows aligning a vector with two others:  the one before it, and the one after it.  In our examples, the one before losua is the elided second person.  The one after it could be pivot lamlosuo or end latu.  We don't have any way to align all three with losua at once.

In my earliest sketch of the grammar, I provided for occasionally non-linear structure by means of a device called a recollective provector.  Provectors have a stem of the form vowel-nasal, and take a class suffix agreeing with their antecedent.

        front                back
close   in-  interrogative   um-  recollective
mid     en-  indefinite      on-  relative
open    an-  demonstrative

The recollective provector um- refers to an antecedent that occurred earlier in the same clause.  What makes it recollective is that it doesn't align with its syntactic predecessor.
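
At this point the word shapes described so far are distinct enough to classify mechanically.  A rough sketch (my approximation; it ignores prefixes and assumes phonotactically well-formed input):

```python
VOWELS = set("aeiou")
NASALS = set("nm")

def classify(word):
    """Classify a token by shape: a lone vowel is a subordinate
    content particle, a consonant-vowel syllable is a role particle,
    a word beginning vowel-nasal is a provector, anything else a vector."""
    if len(word) == 1 and word[0] in VOWELS:
        return "content particle"
    if len(word) == 2 and word[0] not in VOWELS and word[1] in VOWELS:
        return "role particle"
    if word[0] in VOWELS and word[1] in NASALS:
        return "provector"
    return "vector"
```

So classify("uma") reports a provector, while losua, and even the contraction latu, come out as vectors.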

i       losua        so        latu                uma        wo        lamlosuo
INVIT   NEUT.speak   DOM.END   V.STR.this.speech   REC.NEUT   DOM.PIV   V.PIV.this.speech
please speak to me, that in Lamlosuo

From losua, one starts building a linear clause, but then stops when one discovers the next word is a recollective provector (losua so latu), goes back to the antecedent losua, and begins building another linear clause from there (losua wo lamlosuo).

As mentioned, the recollective provector device was created early, in anticipation of later need.  Its actual use didn't begin to arise for some time, until the vocabulary became sufficiently diverse to support a significant number of sentences with more than three vectors in them.  So only then did it become apparent that there were two problems with the recollective provector.

Incorporation

The whole point of the vector-languages concept is that the conspeakers, in fluent speech, are liable to produce long chains of vectors.  Of course we've been translating small, simple English test sentences, so it's to be expected that our vector sentences don't have long simple clauses; but sooner or later, for the project to succeed, fluent vector speech will have to be demonstrated with long vector chains in it.  And both problems with recollective provectors, as originally designed, are related to long simple clauses.

The simpler problem is that in a long simple clause, we expect to find a number of neutral vectors — so that class agreement may simply not be adequate to identify the antecedent of a recollective provector.  This isn't too worrisome; it's a technical problem, but doesn't seem foundational.

There is a deeper problem, though.  Aware from the start that I would eventually have to learn to phrase things fluently — with long simple clauses — my initial plan was to concentrate on creating a language that would support fluent phrasing, and then see if I couldn't ease myself into using more fluent phrasing later on.  The above example, though — i losua so latu uma wo lamlosuo — suggests that the language I'd created might not be able to support long simple clauses:  that any complex sentence would have nonlinear structure requiring use of recollective provectors, and the recollective provectors would chop everything up into very short simple clauses (the longest simple clause in the example is one neutral vector surrounded by two engendered vectors, which might as well be SVO).

Where I'd originally meant to ease into fluent phrasing by means of the prototype, suddenly it looked needful to come up with an example of fluent vector phrasing independent of the prototype.  I've a story, second-hand, of a native German speaker discovering to his pleasure the English term coffee table book; there is something really very German about it (though in German such a thing would likely be one word, coffeetablebook), and one can see how it might strike a German speaker as a breath of air from home.  That is what I wanted:  a (metaphorical) coffee table book for vector speakers.

My test sentence:  "Over the river and through the woods to grandmother's house we go." It does seem rather navigational.  The basic structure (not worrying about specific vocabulary since that's not what we're exploring atm) might be

---lu                li        ---a        ---a           so        ---we           lu        ---tu
V.CUR.group.travel   SUB.PIV   NEUT.over   NEUT.through   DOM.END   NV.PIV.reside   DOM.CUR   V.STR.lineage
we go over-and-then-through to the residence of an ancestor

It seems unlikely the language would have a special vocabulary primitive for over-the-river, or for through-the-woods; more plausibly these would be formed by attaching pivots, respectively river and woods, to more generic primitives for go-over-something and go-through-something.  Extrapolating from the example, I conjecture that any time you have a long chain of vectors offering an opportunity for a long simple vector clause, it's likely that key elements of the chain are too specific to have vocabulary primitives.  And if the only way we have to attach modifiers is using a recollective provector, we're forced to either chop up the longer structure in order to splice in modifiers, or put all the modifiers at the end of the sentence.  Imho, putting all the modifiers at the end doesn't seem like a very navigational organization (although I think there may be some human languages that do weird stuff like that — anadew again).

This is a much narrower problem than the big, general problem addressed by recollective provectors, which are good for building almost arbitrarily complex sentences.  We just need to be able to glide over a few modifiers here and there without disrupting the simple clause structure; and for that I crafted a simple incorporation device.  Any simple clause can be fused into a single word by putting a dab of morphosyntactic glue between each pair of consecutive words; the entire word then behaves, toward the rest of the sentence, as its leading vector — as if a recollective provector had been used, but without disrupting the flow of the clause in which it occurs.  The dab of morphosyntactic glue should be something not found elsewhere; so I used a plosive — unvoiced alveolar, as in tore — pronounced as part of the onset of the following syllable, and written as an apostrophe (both to make its partitioning of the word visually prominent, and to distinguish it from the t allophone of f).

I losua'so'latu wo lamlosuo. — Please speak-to-me in Lamlosuo.
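The incorporation rule itself is purely mechanical: join the words of the clause with the glue plosive. As a throwaway sketch (Python used here only as notation; the function name is mine, not part of the prototype):

```python
def incorporate(words):
    """Fuse a simple clause into a single word by inserting the
    morphosyntactic glue plosive, written as an apostrophe,
    between each pair of consecutive words.  The fused word then
    behaves, toward the rest of the sentence, as its leading vector.
    """
    return "'".join(words)

# The clause "losua so latu" (speak to me) fuses into one word:
print(incorporate(["losua", "so", "latu"]))  # losua'so'latu
```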

While we're at it, this also offers a possible solution to the problem of ambiguous recollective provectors.  Simply incorporate into the provector a repetition of the antecedent word; in our (admittedly trivial) example,

I losua so latu uma'losua wo lamlosuo. — please speak to me, that speaking, in Lamlosuo
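Mechanically, this disambiguation is the same gluing operation, applied to the provector and a copy of its antecedent (again just an illustrative sketch, with a function name of my own choosing):

```python
def recollect(provector, antecedent):
    # Incorporate a repetition of the antecedent word into the
    # recollective provector, so that class agreement is no longer
    # needed to locate the antecedent.
    return provector + "'" + antecedent

# The provector uma picks up its antecedent losua explicitly:
print(recollect("uma", "losua"))  # uma'losua
```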

3 comments:

  1. Interesting way to do away with parts of speech as we know them.

    At first I thought "vector" meant "array of 5 things". It was only after being puzzled and rereading that I realized it meant like a geometrical vector from point A to point B.

    One thing that surprised me was that "Over the river and through the woods to grandmother's house we go" had no path roles. Did I misread? Perhaps I am not understanding roles or the PTH role.

    I once read a paper by Murat Kural arguing that the semantic component "make" (à la David Perlmutter, I think) can be chained indefinitely many times, so the treatment of volitional and nonvolitional interests me. I see where gender neatly distinguishes "I rolled down the hill (on purpose)" from "I rolled down the hill (accidentally)", as English does not. But can it express chained makes, like "I made Sam make Mary roll Bill down the hill"?

    Anyways, interesting.

    Replies
    1. In "Over the river and through the woods to grandmother's house we go", the path role is involved when splicing together the consecutive neutral vectors go-over and go-through. The splicing involves multiple roles; but if one were to use an explicit role particle for this, it would probably be ja, the combined path particle.

      I do have a way in the prototype to express causation — I cause (Sam causes (Mary causes (Bill rolls down the hill))) — even though there's lots I haven't got to yet. The device I have for it should get covered in a second post on Lamlosuo (eventually), as it's related to both irregularity and "pseudo-conjugation". Technically, it constructs a compound clause, with an implicit subordinate content particle hidden inside a combined role particle. Something like "Latu fajoo ja <Sam> fajoo ja <Mary> fajoo ja Bill li <roll> wo <hill>".

    2. The question about multi-level causation really has me thinking. Some vectors use their pivot for an optional agent (this being one of a handful of common uses for the pivot) — for example, the classic insult "you make me puke" would be laso we wafoa fu latu, or latu wo wafoa fi laso (though one could also go for the compound laso fajoo ja latu fi wafoa) — but to achieve multiple levels of causation this way would require a causation vector with a certain kind of structure. Vector fajoa wouldn't do for the purpose, as its use to express causation requires a subordinate clause; so now I'm thinking about how to craft a vector for chained causation.
