Monday, August 31, 2009
why neuroscience?
In preparation for meeting with the future of my academic career next week, I set about answering some silly interview questions. The first of these was, naturally, "why do you want to do research in neuroscience?" So I thought I'd share it, as this is the first time I think I've even put it coherently to myself...
Behavioral evolution. The field of neuroscience contains the tools that most interest me for seeking out patterns in the evolution of behavior. The nervous system - central and peripheral - guides all the other systems of the body through interaction with the world. From the primitive neural web of the cnidarian to the ganglia of the higher animals, it is interaction with the world that makes us what we are. I love neuroscience for attempting to associate corporeal and ethereal phenomena. I love seeking out the physiological correlates of consciousness, and the pathological correlates of behavioral dysfunction. I crave the abstraction of physiology into a medium for mind and consciousness. My own autonomic system excites when connections are made between the evolution of that medium and the evolution of the intelligence it propagates.
Friday, August 28, 2009
on the changing face of science
I was inspired to pursue academic science by Santiago Ramon y Cajal, the man who discovered neuroplasticity. He was an academic, but first, a scientist and artist. At the end of the first book I read of his was his epitaph, which became the source of my conviction that real science, unfettered by the politics of academia, still exists.
"Scientists should be adventuresome people, restless and imaginative. They should be generous souls - poets at times, but always romantics - and they have two essential qualities. They scorn material gain and high academic rank, and their noble minds are captivated by lofty ideals."
In this process of bolstering my curriculum vitae to attract graduate schools, I am inundated with the mean face of academic politics. It demands that I be recognizable within the scientific community before I am even part of it. Academia must be able to identify me as unique before it accepts the burden of my education, but this uniqueness is rightfully represented only by my publications.
While I am the first author of several abstracts, I have not yet published any articles. And while being a first author before having a PhD is impressive, it is significantly less so than being a co-author on a published article. I have a qualm with this particular rule of hierarchy - and with Academia's perspective on publication in general. To be published as a co-author on a scientific paper, you need do as little as collect a key piece of data without having any clue as to its importance to the paper itself. To be a first author, you are either the principal investigator (head of the lab) or you have performed/analyzed/written a substantial portion of the piece. The latter is my case.
I recently mined a book on this subject by Thomas Bender, Intellect and Public Life: Essays on the Social History of Academic Intellectuals in the United States. Among its discussions is Academia's role in steering the evolution of science through publication, and how scientists are coming to be defined by the magnitude - quality and quantity - of their publications rather than by their contributions to scientific progress over their careers.
In Cajal's day (the late 1800s and early 1900s), publication was a serious achievement, like Galileo's 550 printed copies of The Starry Messenger. Today, it is not unusual to expect an undergraduate student to have their name planted among a list of authors on a paper or two. This is a blessing and a curse. The circulation of scientific discovery has skyrocketed to the point where many articles are freely accessible to the public. However, credit for scientific discovery has become such a contentious affair that publications include up to ten co-authors, and the tiniest inkling of participation in the tiniest piece of the article gets your name on the list. Students are recruited to graduate schools based on their publications. Academic scholars are recruited to run labs based on quantity of publication alone, which may or may not sound absurd only to me. The physicist Richard Feynman was asked to head an engineering lab at Princeton because his name was on an unapplied patent that came out of Los Alamos. During the war, Los Alamos was flooded with some of the brightest minds in physics, and the opportunity was taken to exploit any random idea that popped into their heads. A lot of cool shit went down at Los Alamos.
I can't fairly condemn the evolution of publication's importance to Academia, nor Academia's reliance on publication to determine the worth of a scientist. My hope, however, is that quality does not become lost in quantity. I hope that my curriculum vitae communicates that my lack of publication does not represent my throughput or ingenuity...
I have spent over two years with my boss developing a huge project which has evolved from my own undergraduate thesis. The process has involved scrupulous project design, methodology, animal model development, endless amounts of research and, it goes without saying, an infinite amount of experimentation. As we have been too busy writing grant proposals and collecting alternative data, we have not yet published anything on the many results of this project. Fortunately, my boss has offered me the opportunity to co-author a textbook chapter and has set aside time to write an article on the novel mouse model I have developed before I submit my graduate applications. My boss reminds me of Cajal.
I maintain, sanguinely, that many graduate institutions still have the integrity to investigate the entire portfolio of a scientist before deeming them worth investment. That ideal, however, may prove to be too lofty...
Monday, August 24, 2009
ADHD News
Alright. I adore the Huffington Post. Truly. But this is the most ass-backwards half-written piece I think I've ever seen come out of there: ADHD Meds Abuse.
To paraphrase... "This non-profit study says this (kind of), but this big-pharma study says that (almost) - oh noes!... ... ..."
As a scientist, I take personal offense at this kind of writing. As a human being, I take umbrage at the lack of integrity in an article addressing an epidemic controversy such as ADHD. It's empty. There's not even an argument for or against the methodology of one study or another. There are no details as to how the study was conducted, no questioning of the legitimacy of any of it, and the conclusion is that "the study lacks information on whether abusers were teens with ADHD, but anecdotal evidence suggests many are not."
... what? Really? Your conclusion is something that you left until the penultimate sentence to even bring up? And you're not going to expand on it? How the hell did Lindsay Tanner get the frontline with this thing? I'm almost more impressed with the public commentary... <shudders>
In effect, I think this article trammels the purpose of science writing, which is to translate primary literature into layman's terms. It is NOT to transliterate scientific discovery into utterances of empty and useless drivel. What is the purpose of this article? It can't possibly have had any other intent than to create hysteria, and that is the worst possible use of the media in general. First rule of translating: do not imply or directly suggest things that are completely irrelevant; please use intelligence.
I now firmly believe that if this Remicade business does not send me into remission and I am rejected from graduate school based on medical biases - yes, that is a legitimate possibility - I will have to go into science writing and do my utmost to revolutionize its currently upsetting condition.
Sunday, August 23, 2009
on the roles of science
I've been thinking about how the role of scientific endeavor has morphed since the Scientific Revolution. What its value has been, and what it has become.
One of our current controversial quests is to take advantage of the opportunity stem cells hold. Is the aspect of reality at stake here as great as that of Copernicus' heliocentric system or Giordano Bruno's infinite universe? Science's battle with the Church persists in the stem cell revolution, but is it as prevalent and restrictive as the Roman Inquisition was? The obvious answer is that no, it is not. Science has an easier time evolving in modern society because our paradigm since the Scientific Revolution has been to encourage the pursuit of knowledge. Since the breakdown and reconfiguration of the Church, theory has become a friendlier phenomenon. Theory then evolves into science much more naturally and with less turbulence. Scientific discovery glides into public gaze with less suspicion.
There are more scientists and, as such, more reason for the public to accept what the scientific community promulgates. This is something that Thomas Kuhn does not talk about, and I wish he had. He considers the role of individuals within the scientific community who are bolder than others, and who stimulate acts of revolutionary science. The roles of the other community members, however, are somehow uninteresting to Kuhn's concepts.
The role of the scientific community, I think, is to speed up the process of insight. The scientific body gravitates toward one emerging framework or its opponent with the same outcome as the battle of superseding geniuses against the Roman Inquisition. Internal discordance now slows the progress of science instead of the fear of a shift in reality or religious allegiance. It's no less turbid than pre-Revolution, but certainly more urbane... there is no sabotage or torture in modern Science. I actually think that by handing the beast to itself, the Church has simply backed away from the recidivism of the tyrannical father and become the comforting grandmother to those who can't handle theoretical threats posed to reality.
So has it become easier for science to thrive since the Revolution? Yes. Has new knowledge become globally welcome instead of globally feared? In those territories that experienced a Scientific Revolution, yes. Has the quelling of that global fear opened new doors for the free emergence of scientific revolution? Perhaps.
Does revolutionary scientific discovery hold the same earth-shattering importance that it once did? Not really.
Does an individual scientist hold as much importance to the growth and sustenance of the world as one did during the Renaissance? Well...
I like to hope that we do. But I will need more convincing as my career proceeds.
Thursday, August 20, 2009
on the emergence and immersion of geniuses
R. Buckminster Fuller was an American architect known for patenting the geodesic dome design, and for the carbon allotrope that bears his name. He is - probably more accurately - known for the saying: "Everyone is born a genius. Society de-geniuses them."
I think that Fuller's own history is a perfect example of his claim. I also agree strongly with this particular statement. For instance, the C60 allotrope known as the buckyball or buckminsterfullerene was actually discovered by Robert Curl, Sir Harold Kroto and Richard Smalley in 1985. Their 1996 Nobel Prize gives these men their due credit, but outside the world of chemistry, R. Buckminster Fuller is wrongly assumed to be a chemist, and the discoverer of buckminsterfullerene. The point I aim to make here is that Fuller was incontestably a genius, but he is not known as such for the correct reasons.
Socioeconomic evolution has imbued the term "genius" with a mélange of defining characteristics with which I don't agree. Before I get in too deep with this assertion, I should clarify that I define genius as one whose unique curiosities become manifest. That's it. Psychometrics has tried admirably to find a way to assess intelligence on a relative scale, allowing the sighting of geniuses and savants as they arise. Inasmuch as these tests do have some value in designing our current educational system (with which I also have major discordance), they also serve as a societal barrier to the acceptance of the creativity from which genius comes.
I am inclined toward the idea that genius emerges from reaching outside of paradigms, and happens independently of academic guidance. Fuller's genius is identifiable in his childhood forays into original architecture, tool design and novel propulsion methods; he was twice expelled from Harvard. Nikola Tesla invented his first paddle-less water wheel at the age of four; he spent one term at the Charles-Ferdinand University in Prague. Francisco Goya: the last of the Old Masters and the first of the Moderns. Mozart and Galileo... all geniuses oppressed or quelled by economic demand, marketing and immersion in societal constructs.
I suppose this is my way of letting off some steam from the pneumatic build-up of my anxiety and ambivalent contention over graduate school.
I prefer to hope that I have something unique to offer the world of intellect and scientific discovery. While I was never a child prodigy, and have most definitely been the product of academia-induced, ADD-promulgated study, there is still a chance that some modicum of genius might arise in the later stages of my own evolution.
In short, it is my profound hope that I am playing out Fuller's apothegm in reverse - that I have first been de-geniused by society, and am now in the midst of my journey toward emergence.
Thursday, August 6, 2009
chris kelly + the constitution
On August 4th, the day before President Obama's birthday, Chris Kelly, a writer for the Bill Maher show, wrote a piece on the authenticity of our president as determined by our Constitution, and on the practicality of our Constitution. (that was a commariffic sentence!)
When interpretation is called into question, I almost immediately become bored, feeling that the inherent flaw in relying on interpretation as truth is itself fragmenting and destructive. There is no direct or obvious interpretation of anything, and so the subject of flawed interpretation is spoiled and dull.
However. I was inspired to propose a response to Kelly's article (to be interpreted as one would perceive Kelly's article). I think that the United States Constitution should be rewritten on a quarterly basis. And by quarterly, I of course mean every 25 years. Additionally, if nobody wants obfuscated avenues of potential conclusion, shouldn't the Constitution be written as a series of premises? "This, That, Therefore such and such at such and such a time, place and intensity." Eh?
I say 25 years because really, how much social evolution can take place in less than that time? The emergence of each new generation seems like the appropriate time to reevaluate norms. The kind of growth that requires a reformation of the nature of the rules is really of a greater ilk than what happens on an annual scale. For instance, the results of great legal cases (which determine the brunt of the amendments to our current Constitution) should be considered every 25 years, and a consensus should be taken as to which should be implemented in the Constitution and contribute to the growth of National Law.
How much easier this would be. The clarity - the insane acceptability. The absence of religious tongue-tying and the degree of misrepresentation and misconstrued allusion! I think the Constitution should be written for toddlers. Clearly, this is the only way the United States will be able to agree with itself as to "what the fathers wanted". Come, now... WE are now the fathers. We beget the carriers of the next stage of humanity's evolution. We are building for posterity. Let us build on foundations that will uphold our direction and progress, not the rickety and wizened inspirations that allowed our nation to become what it is.
This being said, there is a reason I never went into politics. Call me an elitist - I prefer for the decisions to be made by people who actually understand what is going on, what is implied and what is realistic. The populace should be responsible for finding someone trustworthy, and for communicating concerns at a State level. Beyond that, please oh please let the politicians decide whether Obama is trying to kill our grandmothers or not. I hate town halls. If you're going to lobby (and don't be mistaken - I certainly have done my share), please do so in a polite and intelligent fashion. Picketing is for children and hippies.
Wednesday, August 5, 2009
penrose + consciousness
Sir Roger Penrose is one of my favorite fellows. He's a mathematician who uses his intelligence to design obscure puzzle patterns while simultaneously denying the emergence of consciousness from a pattern, for god's sake - what more could you want in a man?
He uses an entire book - Shadows of the Mind (the "sequel" to The Emperor's New Mind) - to discuss how consciousness can't arise from a consistent mathematical system. He begins, essentially, with E = mc²: a number of superposed quantum states in the brain "work" until there is a gravitational difference between their energy and mass. The gravitational significance of this difference causes the states to collapse - or unfold - into one. This single conglomerate state then becomes observable in the gross physical world - as an action potential in a neuron, perhaps.
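For the quantitatively curious: as I understand Penrose's proposal (this is my paraphrase, so take the notation as a sketch rather than his exact formulation), a superposition of two different mass distributions carries a gravitational self-energy of their difference, and the superposition collapses on a timescale of roughly

```latex
% Sketch of Penrose's objective-reduction timescale.
% E_G = gravitational self-energy of the *difference*
%       between the two superposed mass distributions;
% \hbar = the reduced Planck constant.
\tau \approx \frac{\hbar}{E_G}
```

So the greater the mass displacement between the superposed states, the larger E_G and the faster the collapse - which would be why we never catch macroscopic objects in superposition, while tiny systems can linger there.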
At this point, Penrose would seem to agree with David Bohm's implicate order: actuality is the result of probabilistic collapses/unfoldings of quantum and subquantum states. We observe light similarly. When the medium of a laser absorbs particular wavelengths of light, the electrons of its atoms elevate to a higher energy state. When enough electrons are excited to this high-energy state (population inversion), they collapse together to a lower energy level, which results in the emission of light - an observable condition created from a quantum conglomeration.
I have always been somewhat fearful of humanity's discovering the substrate of consciousness and applying it to artificial intelligence. It is comforting that Penrose agrees with me (!) that we will not be able to design A.I. with consciousness in the foreseeable future, because it's something we are not nearly close enough to understanding, not to mention being able to coalesce and manipulate.
It is possible to suggest, then, that the connection between consciousness and brain is a physiological exploitation of the vast magnitude of activity in collapsed quantum states. This "non-algorithmic ingredient", as Penrose coins it, also jibes with Bohm's suggestions of probability's role in the playing-out of the quantum universe in the gross or actual universe. Could we give our fatty brains such credit as to be the medium by which quantum states become consciousness? This is a question Penrose explores in The Emperor's New Mind, and which I will not dare attempt to dissect.
Not at the moment, anyway.
More to come, then.