Template:Featured Article Candidate: Difference between revisions

From Citizendium
Comparing the revision by imported>Caesar Schinas (m: remove bullet) with the latest revision by imported>Daniel Mietchen (extend period of eligibility for the time being, since we have so few nominations); 20 intermediate revisions by 2 users are not shown.
Previous revision:

<includeonly>{{rpl|{{{1|}}}||}}
{{Winner|{{{1|}}}}}</includeonly><noinclude>{{TlDoc}}</noinclude>

Latest revision:

<includeonly>|- {{#if:{{{created|}}}| {{#ifexpr:{{#time:Ymd|{{{created}}} +360 days}} < {{#time:Ymd}} | style="background:#fcc;" }} }}
| style="vertical-align:top; padding:0.5em 1em;" | {{rpl|{{{article}}}||}}<hr style="background:none; border:dotted #ccc; border-width:1px 0 0;" />{{:{{{article}}}}}&nbsp;<span style="font-size:smaller; font-weight:bold; white-space:nowrap;">(''[[{{{article}}}|Read more...]]'')</span>
| style="vertical-align:top; padding:0.5em;" | {{{supporters|}}}
| style="vertical-align:top; padding:0.5em;" | {{{specialists|}}}
{{#ifeq:{{BASEPAGENAME}}|New Draft of the Week|
{{!}} style="vertical-align:top; padding:0.5em;" {{!}} {{#if:{{{created|}}} | <p>Created:<br />{{{created}}}</p><p>Last eligible:<br />{{#time:Y-m-d|{{#time:Ymd|{{{created}}} +60 days}} }} {{#ifexpr:{{#time:Ymd|{{{created}}} +60 days}} < {{#time:Ymd}} | <br />('''Expired!''') }}</p> | '''Please add the date of creation.''' }}
}}
| style="vertical-align:top; padding:0.5em;" | {{{score|}}}
</includeonly><noinclude>{{TlDoc}}</noinclude>

Latest revision as of 09:00, 22 October 2010

This documentation is transcluded from Template:Featured Article Candidate/doc.

This template is for displaying candidate articles at CZ:Article of the Week and Archive:New Draft of the Week.
Once a winning article has been chosen, it is displayed using {{Featured Article}}.

Usage

Step 1 - Changes to the candidate article

For both this template and {{Featured Article}}, it is necessary to make some changes to the candidate article itself.

  1. Place <onlyinclude>...</onlyinclude> tags around the section of the article which is to be featured - normally the first paragraph or two.
  2. Place <noinclude>...</noinclude> tags around anything within this section which should not be shown in the featured article.
  3. Place <includeonly>...</includeonly> tags around anything within the onlyinclude section which should only be shown in the featured article, and not on the article page itself (see the sketch below).
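For instance, a candidate article's wikitext might be marked up as follows (a purely illustrative sketch; the prose is placeholder text, not taken from any real article):

<onlyinclude>
The first paragraph or two of the article, which will appear both on the article page and in the featured box.
<noinclude>Material that should appear only on the article page itself, such as an infobox.</noinclude>
<includeonly>Material shown only when the article is featured.</includeonly>
</onlyinclude>
The rest of the article, which is never transcluded.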

The following are examples of changes which must be made to the section of the article that will be used:

Heading sizes
Headings must always be reduced by two sizes in the included section, by adding <includeonly>==</includeonly> before and after, as follows:
<includeonly>==</includeonly>==Heading==<includeonly>==</includeonly>
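When the article is transcluded, the <includeonly> parts are rendered and the heading is read as ====Heading==== (a level-4 heading); on the article page itself they are dropped, so it remains the ordinary ==Heading==.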
Image sizes
Images should generally be no bigger than 250px or so in the featured section of the article. To have small images when featured but large ones on the article page itself, do something like this:
{{Image|Image name.jpg|right|<noinclude>500px</noinclude><includeonly>250px</includeonly>|Image caption}}
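On the article page the <includeonly> part is ignored, so the image is shown at 500px; when the article is transcluded as a featured candidate, the <noinclude> part is dropped instead and the same image is shown at 250px.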
Infoboxes
Infoboxes should never be included in the featured article. Place <noinclude>...</noinclude> tags around them.
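For instance, wrapping a hypothetical infobox call (the template name and field below are made up purely for illustration):

<noinclude>{{Infobox example
| field = value
}}</noinclude>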

Step 2 - Adding the article to the list

The candidate article can then be added to the list at CZ:Article of the Week using the following code:

{{Featured Article Candidate
| article     = 
| supporters  = 
| specialists = 
| score       = 
}}

Or to Archive:New Draft of the Week using the following code:

{{Featured Article Candidate
| article     = 
| supporters  = 
| specialists = 
| created     = 
| score       = 
}}
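For example, a hypothetical entry (reusing the nomination shown in the Example section below, with a made-up creation date) might look like this; the created value only needs to be in a date format that the #time parser function understands:

{{Featured Article Candidate
| article     = Music perception
| supporters  = 
| specialists = [[User:Daniel Mietchen|Daniel Mietchen]] 17:24, 4 June 2009 (UTC)
| created     = 4 June 2009
| score       = 3
}}

The created date drives the eligibility display on Archive:New Draft of the Week: the template shows "Last eligible" as created + 60 days (here 2009-08-03) and marks the row as Expired! once the current date is past that point.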

Example

When used within the Article of the Week table, the following code:

{{Featured Article Candidate
| article     = Music perception
| supporters  = 
| specialists = [[User:Daniel Mietchen|Daniel Mietchen]] 17:24, 4 June 2009 (UTC)
| score       = 3
}}

gives the following result:

Nominated article | Supporters | Specialist supporters | Score
Music perception (Developing Article): The study of the neural mechanisms involved in people perceiving rhythms, melodies, harmonies and other musical features.
Processing a highly structured and complex pattern of sensory input as a unified percept of "music" is probably one of the most elaborate features of the human brain. In recent years, attempts have been made to investigate the neural substrates of music perception in the brain. Though progress has been made with the use of rather simplified musical stimuli, how music is perceived and how it may elicit intense sensations is still far from understood.

Theoretical models of music perception face the major challenge of explaining a vast variety of aspects connected to music, ranging from temporal pattern analysis, such as metre and rhythm analysis, through syntactic analysis, for example the processing of harmonic sequences, to more abstract concepts like the semantics of music and the interplay between listeners' expectations and suspense. Attempts to give some of these aspects a neural foundation are discussed below.

[Figure: A modular framework of music perception in the brain, after Koelsch et al. and Peretz et al.]

Several authors have proposed a modular framework for music perception [1][2]. Following Fodor, mental "modules" have to fulfil certain conditions, among the most important of which are information encapsulation and domain-specificity. Information encapsulation means that a (neural) system performs a specific information-processing task and does so independently of the activities of other modules. Domain-specificity means that the module reacts only to specific aspects of a sensory modality. Fodor defines further conditions for a mental module, such as rapidity of operation, automaticity, neural specificity and innateness, whose validity for music-processing modules has been debated.

However, there is evidence from various complementary approaches that music is processed independently from, for example, language, and that there is not even a single module for music itself, but rather sub-systems for different relevant tasks. Evidence for spatial modularity comes mainly from brain lesion studies in which patients show selective neurological impairments. Peretz and colleagues list several cases in a meta-study in which patients were not able to recognize musical tunes but were completely unaffected in recognizing spoken language[2]. Such "amusia" can be innate or acquired, for example after a stroke. On the other hand, there are cases of verbal agnosia in which patients can still recognize tunes and seem to have an unaffected sensation of music. Brain lesion studies have also revealed selective impairments for more specialized tasks such as rhythm detection or harmonic judgements.

The idea of modularity has also been strongly supported by the use of modern brain-imaging techniques like PET and fMRI. In these studies, participants usually perform music-related tasks (such as detecting changes in rhythm or out-of-key notes). The resulting brain activations are then compared with those from a reference task, so that brain regions which were especially active for a particular task can be detected. Using a similar paradigm, Platel and colleagues have found distinct brain regions for semantic, pitch, rhythm and timbre processing [3].

To investigate the dependencies between different neural modules, brain-imaging techniques with high temporal resolution, such as EEG and MEG, are usually used; these can reveal the delay between stimulus onset and the processing of specific features. Such studies have shown, for example, that pitch height is detected within 10-100 ms after stimulus onset, while irregularities in harmonic sequences elicit an enhanced brain response 200 ms after stimulus presentation[1]. Another method to investigate the information flow between the modules in the brain is TMS. In principle, DTI or fMRI observations combined with causality analysis can also reveal these interdependencies. (Read more...)

Supporters: (none) | Specialist supporters: Daniel Mietchen 17:24, 4 June 2009 (UTC) | Score: 3
  1. Koelsch, S.; Siebel, W.A. (2005). "Towards a neural basis of music perception". Trends in Cognitive Sciences 9 (12): 578-584. DOI:10.1016/j.tics.2005.10.001.
  2. Peretz, I.; Coltheart, M. (2003). "Modularity of music processing". Nat Neurosci 6 (7): 688-691. DOI:10.1038/nn1083.
  3. Platel, H.; Price, C.; Baron, J.C.; Wise, R.; Lambert, J.; Frackowiak, R.S.; Lechevalier, B.; Eustache, F. (1997). "The structural components of music perception. A functional anatomical study". Brain 120 (2): 229-243. DOI:10.1093/brain/120.2.229.