Dialog with marine mammals… and with at least one dog

Many poetic and philosophical texts, from Plutarch to Shakespeare, carry down to our time humanity’s everlasting desire to talk with dolphins, these friendly, social, intelligent, yet rather enigmatic creatures.

In the mid-1960s, Batteau began research into the possibility of developing a man/dolphin communicator. The pioneering studies began with a male bottlenosed dolphin, carrying out preliminary research into the feasibility of man-to-dolphin and dolphin-to-man translators. Batteau used electronic whistles produced by a generator. These whistles, ranging between 4 and 12 kHz, were emitted by an automatic setup called the Vocal Trainer. Simultaneously, the apparatus analyzed the animals’ responses and compared them with the signals emitted. After a few months of conditioning experiments he showed that dolphins could be trained to associate a body motion with an underwater acoustic signal. Later they could be trained to reply with whistles to acoustic signals of great complexity and to imitate them, copying pitch contours very accurately.

A major flaw in this approach, however, was that individual sounds were not associated with individual semantic elements, such as objects or actions, but instead functioned as holophrases (complexes of elements). For example, a particular whistle sound instructed the dolphin to “hit the ball with your pectoral fin.” Another sound instructed the dolphins to “swim through a hoop.” Unlike a natural language, there was no unique sound to refer to “hit”, “ball”, “hoop”, “pectoral fin”, or any other unique semantic element. Hence, there was no way to recombine sounds (semantic elements) to create different instructions, such as “hit the hoop (rather than the ball) with your pectoral fin.”

Unfortunately, Batteau drowned during a morning swim in 1967. His death cut short what was the first established demonstration of interspecies communication. The final report (Batteau and Markey, 1967) concluded that devices had been constructed to translate articulated vowel sounds into sinusoidal whistles and to provide real-time visual displays of the frequency-modulated whistles. Two dolphins proved able to respond distinctly to 35 response-demand messages embedded in five-word spoken sentences. However, because of the flaw in the construction of the language noted above, the experiment was not a valid test of dolphin linguistic capabilities.

Herman and colleagues (Herman et al. 1984, 1994, 1999; Herman 1980, 1986, 1990) studied dolphins’ linguistic skills, focusing on language comprehension rather than language production. They concentrated on dolphins’ receptive competencies, mainly their capabilities for processing both semantic and syntactic information. The primary syntactic device used in the studies was word order. Dolphins were shown to be capable of understanding that word order changes meaning.

Researchers taught two dolphins in parallel. One dolphin, Phoenix, was taught to respond to acoustic signals (short computer-generated noises), while the other, Akeakamai, was taught to respond to gestures (head and arm movements of a trainer by the pool). Both sets of signals were designed to act as a kind of language, with lexical components (words) representing objects, actions and modifiers, and a set of rules, or syntax, for combining the signals (grammar). The words that the dolphins learned allowed a large range of sentences to be generated, and the animals’ comprehension was then tested by analyzing their responses. For example, a sentence that means “take the ball to the frisbee” should lead to a different response than “take the frisbee to the ball”, even though the same signals are used.

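The word-order rule at the heart of such an artificial language can be sketched in code. The vocabulary and the English glosses below are illustrative placeholders, not the actual signal set used with Akeakamai; the point is only that the same three signals yield different instructions depending on their order.

```python
# Minimal sketch of a word-order-sensitive relational sentence of the
# kind described above: DESTINATION OBJECT ACTION means
# "take OBJECT to DESTINATION". Vocabulary is illustrative only.

OBJECTS = {"ball", "frisbee", "basket", "hoop"}
ACTIONS = {"fetch"}

def interpret(sequence):
    """Interpret a three-item relational sentence."""
    if len(sequence) != 3:
        raise ValueError("only three-item relational sentences are handled here")
    destination, obj, action = sequence
    if destination in OBJECTS and obj in OBJECTS and action in ACTIONS:
        return f"take the {obj} to the {destination}"
    raise ValueError(f"ungrammatical sequence: {sequence}")

# Word order changes meaning even though the same signals are used:
print(interpret(["frisbee", "ball", "fetch"]))   # take the ball to the frisbee
print(interpret(["ball", "frisbee", "fetch"]))   # take the frisbee to the ball
```

A correct response to both sequences with the same three signals is evidence that the animal processes order, not merely the identity of the signals.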
During the course of her training Akeakamai was tested with 193 novel sentences when all 13 objects (such as a frisbee, a basket, a ball, a stream of water in the pool, a stationary speaker, and others, including the second dolphin) were in the pool, so the objects provided no clue as to how Akeakamai should respond. The dolphin demonstrated understanding of the rules that structured the artificial language and the ability to use these rules to interpret novel sequences of signals. After experiencing many two- and three-word sentences, Akeakamai was suddenly given one containing four words, and she responded correctly. The implication of this finding is that she was able to use the rules relevant to the shorter sentences to understand a more complex syntactic structure. As Pearce (2000) puts it, this is precisely the sort of skill that should be demonstrated by an animal that is grammatically competent.

Akeakamai was also shown to understand logical extensions of a syntactic rule spontaneously and to extract a semantically and syntactically correct sequence from a longer anomalous sequence of gestures given by the human (Herman et al., 1994; Herman, 2002). To perform this extraction, the dolphin in some cases had to conjoin nonadjacent terms in the sequence. For example, the anomalous string glossed as “water speaker frisbee fetch” violates syntactic rules in that no rule accommodates three object names in a row. However, embedded in this sequence are two semantically and syntactically correct three-item sequences, “water frisbee fetch” (bring the frisbee to the stream of water) and “speaker frisbee fetch” (bring the frisbee to the underwater speaker). In sequences of this type, the dolphin almost always extracted one or the other of the correct three-item sequences and acted on that implicit instruction. Herman (2002) suggests that the dolphin utilizes an implicitly learned mental representation, or schema, of the grammar of the language, including not only word-order rules but also the semantic rules determining which items are transportable and which are not (neither the stream of water nor the underwater speaker affixed to the tank wall could be transported). No explicit training was given for these rules.

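The extraction task can be sketched as a search over ordered subsequences of the anomalous string. The vocabulary and the transportable/fixed split below are illustrative, covering only the items mentioned in this example; they are not the full lexicon of the study.

```python
# Sketch of the extraction problem posed by anomalous strings such as
# "water speaker frisbee fetch": enumerate the grammatical three-item
# DESTINATION OBJECT ACTION sentences embedded in the string, where the
# items need not be adjacent but must keep their order.

from itertools import combinations

TRANSPORTABLE = {"ball", "frisbee", "basket"}   # items that can be carried
FIXED = {"water", "speaker"}                    # items that cannot be moved
OBJECTS = TRANSPORTABLE | FIXED
ACTIONS = {"fetch"}

def embedded_sentences(sequence):
    """Return all grammatical (destination, object, action) subsequences."""
    result = []
    for i, j, k in combinations(range(len(sequence)), 3):
        dest, obj, act = sequence[i], sequence[j], sequence[k]
        # only a transportable item can be fetched; any object can be a goal
        if dest in OBJECTS and obj in TRANSPORTABLE and act in ACTIONS:
            result.append((dest, obj, act))
    return result

print(embedded_sentences(["water", "speaker", "frisbee", "fetch"]))
# [('water', 'frisbee', 'fetch'), ('speaker', 'frisbee', 'fetch')]
```

The semantic constraint (only transportable items may follow the destination slot) is what prunes “speaker” out of the object position, mirroring the untrained rule the dolphin appeared to apply.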
The dolphins have also met the displacement criterion in numerous trials. On some trials the instruction related to an object that was hidden from view (spatial displacement), and the dolphin had to find it before responding. On others the command was issued as much as 30 seconds before the related objects were thrown into the pool (temporal displacement). One version of this type of test consisted of placing all the objects but one in the tank and then giving a command that related to the missing object. Akeakamai would often search for up to nearly a minute for the missing item and then stop, without responding to the other objects. She also rapidly learned to press a paddle to indicate that the designated item was missing.

The dolphins’ ability to understand whether things are present or missing made it possible to test whether they are capable of symbolic reference. For this purpose, Herman and Forestell (1985) constructed a new syntactic frame consisting of an object name followed by a gestural sign glossed as Question. For example, the two-item gestural sequence “basket question” asks whether a basket is present in the dolphin’s habitat. The dolphin could respond “yes” by pressing a paddle to her right or “no” by pressing a paddle to her left. Over a series of such questions, with the particular objects present being changed over blocks of trials, the dolphin was as accurate at reporting that a named object was absent as she was at reporting that it was present. These results gave a clear indication that the gestures assigned to objects were understood referentially by the dolphin, i.e., that the gestures acted as symbolic references to those objects.

In terms of receptive language competencies, Herman and his colleagues have shown dolphins to be capable of (1) successfully processing the semantic and syntactic features of a command system; (2) learning syntactic rules; (3) understanding novel sentences; (4) labelling objects; (5) reporting; and (6) learning elaborate commands independently of the sensory modality used, thereby demonstrating linguistic comprehension.

Results from similar studies on sea lions, which revealed much the same findings as with dolphins, can be found in publications by Schusterman and co-authors (Schusterman and Krieger 1984; Schusterman and Gisiner 1988; Schusterman et al., 2002). As in the dolphin studies, the researchers focused their efforts on teaching three sea lions to relate particular gestural signals to objects (such as bats, balls, and rings), modifiers (large, small, black, and white), and actions (such as fetch, tail touch, and flipper touch). These signals could be combined into over 7,000 different combinations, each instructing the animal to carry out a specific behavioral sequence.

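How a modest lexicon yields thousands of distinct instructions can be illustrated with back-of-the-envelope arithmetic. The counts below are invented for illustration and do not reproduce the actual Schusterman lexicon; they only show how quickly simple and relational sentence frames multiply.

```python
# Illustrative combinatorics: a small lexicon of objects, modifiers and
# actions, combined through two sentence frames, quickly exceeds 7,000
# distinct instructions. All counts here are hypothetical.

n_objects = 10   # e.g. bats, balls, rings, ...
n_sizes   = 3    # large, small, or unspecified
n_colors  = 3    # black, white, or unspecified
n_actions = 8    # fetch, tail touch, flipper touch, ...

# frame 1: perform an action on one (possibly modified) object
simple = n_objects * n_sizes * n_colors * n_actions

# frame 2: take one fully described object to another
described_objects = n_objects * n_sizes * n_colors
relational = described_objects * (described_objects - 1)

print(simple)               # 720
print(simple + relational)  # 8730
```

Even with these small hypothetical counts the relational frame alone contributes over 8,000 combinations, which is why a vocabulary of a few dozen gestures can support thousands of testable novel instructions.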
A California sea lion named Rocky was able to learn to understand sentence forms similar to those understood by the dolphin Akeakamai. Rocky could carry out gestural instructions effectively for simpler types of sentences requiring an action on an object. The object was specified by its class membership (e.g., “ball”) and in some cases also by its colour (black or white) or size (large or small). Rocky was able to understand relational sentences requiring that one object be taken to another object. More complicated instructional sequences required the sea lion to press one of two paddles to indicate whether an object was present or absent. The most complicated instructions required the sea lion to select one object in the tank and bring it to another. These “relational” sequences could include up to seven signs; for example, the gesture sequence “large white cone, black small ball fetch” instructed the sea lion to bring the black small ball to the large white cone. The sea lions were eventually able to respond appropriately to familiar as well as novel combinations of signs with a great deal of accuracy.

These reports suggest that the sea lion was capable of semantic processing of symbols and, to some degree, of syntactic processing. A shortcoming of the sea lion work, however, was the absence of contrasting terms for relational sentences, such as the distinction between “fetch” (take to) and “in” (place inside of or on top of) demonstrated for the dolphin Akeakamai. Additionally, unlike the dolphin, the sea lion was given its strings of gestures discretely, each gesture followed by a pause during which the sea lion looked about to locate the specified objects before being given the next gesture in the string. In contrast, gesture strings given to the dolphin Akeakamai flowed without pause, analogous to a spoken sentence in human language. Further, Rocky did not show significant generalization across objects of the same class (e.g., different balls) but, unlike the dolphin, seemed to regard a gesture as referring to a particular exemplar of the class rather than to the entire class. Thus, although many of the responses of the sea lion resembled those of the dolphin, the processing strategies of the two seem to be different, and the concepts developed by the sea lion appear to be more limited than those developed by the dolphin (Herman, 2002).

It is interesting to note that at least one dog (a mongrel named Sofia) has displayed the ability to understand requests composed of two independent information items which had to be combined for correct performance. Ades et al. (2005) conducted a series of experiments on human-dog communication through arbitrary signals. The first experiment dealt with action-object sentences (such as “point ball”, “fetch bottle”), the second with action-place sentences, in which the action terms were verbal (“point” and “fetch”) and the place terms were gestural signals. Training consisted of a discrimination phase, in which verbal or gestural signals were associated with the relevant behaviours; a sequential request phase, in which object and action (Experiment I) or place and action (Experiment II) requests were presented one at a time; and a simultaneous request phase, in which Sofia had to carry out action + object and action + place commands presented as sentences. The results indicate that Sofia, and probably dogs in general, has the ability to perceive and keep in memory more than one item of request information, as chimpanzees and other linguistically trained animals do, but also that there are limits to such cognitive processing of arbitrary signals. A third experiment tested Sofia’s ability to produce arbitrary signals through the use of an electronic keyboard with arbitrary geometric symbols for “water”, “food”, “cage”, “petting”, etc. The results indicate that Sofia pressed the keys in an intentional way, that is, according to the motivational context and the presence of relevant stimuli. Use of the symbols was associated with glances directed at the experimenter, an indication of communicative intent.

 

31. A BATTLE FOR ROSETTA STONE: ATTEMPTS TO DECIPHER ANIMALS’ SIGNALS

 

The Rosetta Stone is the famous key to the ancient Egyptian language. It was carved in 196 B.C. and later discovered by French soldiers who came with Napoleon. The stone bears the same text in three scripts: Egyptian hieroglyphs; Demotic, a shorthand version of Egyptian hieroglyphic writing; and Greek. By working from the Greek section, Jean-François Champollion and Thomas Young were able, in the 1820s, to crack the code of the stone and essentially decipher what the hieroglyphs meant. Before that it was not possible to understand ancient Egyptian texts despite their abundance. Decoding an animal language is perhaps no simpler a task, and we cannot hope to find a Rosetta stone.

The problems underlying the construction of animal dictionaries have been discussed in detail during recent decades (Theberge and Falls, 1967; Reznikova and Ryabko, 1990, 1993; Ryabko, 1993; Hauser, 2000). Many workers have tried to decode animal languages by looking for “letters”, “words” and “phrases” and by compiling “dictionaries”. With such an approach it often remains unclear which sounds and gestures have something to do with the language and which do not, and there are also technical difficulties connected with the high mobility of animals and often with their inaccessibility for recording signals (Ryabko and Reznikova, 1996).

Theberge and Pimlott (1969) noted that, when studying wolves’ ability to produce and distinguish subtle details of acoustic signals, they faced the problem of understanding the sounds of a foreign culture while lacking a relevant dictionary and any idea about that culture. Wolves form stable packs and, like many other social animals, live in a world of dominance interactions. They use vocalisations extensively both when communicating with other pack members and with wolves in other packs. The researchers suggested that wolves were able to transfer information by changing certain units of their acoustic communication, but the only “word” they managed to decipher was a “sound of loneliness” that wolves produce when placed in isolation, anxious to join the others.

Since then, many distinct acoustic signals have been revealed in wolves, African wild dogs (Robbins, 2000), bottlenosed dolphins (Janik, 2000), primates (Snowdon et al., eds., 1982) and others. Acoustic vocalisation in some species of birds and mammals often has a hierarchical structure, with notes grouped into syllables, syllables grouped into phrases, and phrases grouped into a song as a linear array of phrases. These data enable researchers to undertake efforts to understand the meaning of animal signals and to test whether a species’ communication exhibits a language format.

For example, dolphins produce various types of sounds, including clicks, burst-pulse emissions, and whistles. Clicks are used for echolocation; burst-pulse sounds may indicate the dolphin’s emotional state, ranging from pleasure to anger; whistles may be used for communication. During the 1960s, researchers attempted to determine whether the whistle vocalizations could be a form of language. Investigators recorded whistles from many dolphins in many different situations but failed to demonstrate sufficient complexity in the vocalizations to support the claim that dolphins can refer symbolically to another individual or to some other object or event in the environment (Herman, 1991). Although dolphins have demonstrated great potential in the use of artificial intermediary languages, there is no evidence that they have a natural language. Is it hopeless, then, to decipher at least fragments of natural animal languages?

Animals often behave similarly in repeatable situations, and if these repeatable behaviours are elicited by distinctive repeatable signals, the behaviours can serve as keys for cracking animals’ species-specific codes. Decoding the function and meaning of wild communication is a notoriously difficult problem. Up to now, two natural communication systems have been partly deciphered: the acoustic communication of vervet monkeys and the “dance language” of honeybees. In both cases expressive and distinctive signals correspond to repeatable and frequently occurring situations in the animals’ lives.

 





