Note For Anyone Writing About Me

Guide to Writing About Me

I am an Autistic person, not a person with autism. I am also not Asperger's. That diagnosis isn't even in the DSM anymore, and yes, I agree with the consolidation of all autistic spectrum stuff under one umbrella. I have other issues with the DSM.

I don't like Autism Speaks. I'm Disabled, not differently abled, and I am an Autistic activist. Self-advocate is true, but incomplete.

Citing My Posts

MLA: Hillary, Alyssa. "Post Title." Yes, That Too. Day Month Year of post. Web. Day Month Year of retrieval.

APA: Hillary, A. (Year Month Day of post.) Post Title. [Web log post]. Retrieved from

Wednesday, June 21, 2017

Alyssa Reads: Critical Studies of the Sexed Brain -- Communication thoughts

I continue my thoughts from reading Critical Studies of the Sexed Brain. Because I had more and then forgot to put them up here. Go me.  Here's the citation again if you want it:

Kraus, C. (2012). Critical studies of the sexed brain: A critique of what and for whom? Neuroethics, 5(3), 247-259. doi:10.1007/s12152-011-9107-7

And now the quote that got me thinking:

“Critical neuroscientists frame the question of a science gap between neuro- and social scientists, experts and the public, just as couple's guides conceive of the gender gap in terms of unawareness, misunderstanding, or ignorance, promoting the idea that all matters can be settled through enhanced communication and better knowledge of each other's distinctive language, culture, needs or concerns.”

This needs more attention paid to it. Here is a big issue: there is a power imbalance. Patriarchy is a word for the imbalance in the couple's guide, and it would relate to the sciences one too since hard sciences tend to be thought of as men's fields while social sciences are thought of more as women's fields. (Accuracy of this thinking is another issue, but STEM in general runs man-heavy.)

That contributes to the rhetorical positioning of the fields, where neuroscientific “facts” can't be questioned by social sciences, even if questioning the facts isn't exactly what's going on. Sometimes it's questioning the causes and interpretation of the reported result rather than questioning whether or not the result was correct, or reproducible. Though the fMRI study of a dead fish is relevant, and so is the fMRI of the same person daily for about a year – fMRI is not infallible, any more than any scientific procedure is, and pretending it is will get us into trouble.

The author then asks about “lay expertise” from patients, relatives, and activists. Since I'm studying neuroscience but came from the Neurodiversity Movement before I got into neuroscience, I wonder where that puts me. As a neuroscience student, I'm one of the science people. As an Autistic person, I'm somewhat a patient. (Not much of one, haven't been in therapy related to autistic traits for a while, but when I write as an Autistic person, I go in that category.) And there is definitely a power difference between the roles. There has to be, for Theory of Mind to have been interpreted to mean autistic people can't understand our own experiences. Not everyone making use of the word thinks that, but it's an interpretation I've seen way too much of.

The author then points to this framework as “preventative politics,” where it keeps the peace by avoiding/assuaging conflict in the name of interdisciplinarity. She argues this could prevent good science that would come from controversy. I'd agree, but also say that it can involve silencing of ideas that aren't status quo as part of the peacekeeping.

Another issue with the focus on communication is that it only works if everyone is acting in good faith. It's the same problem with Nonviolent Communication and similar: if everyone is acting in good faith, it works fine. If anyone involved is actually seeking to maintain control or to do harm, consciously or not, it's not going to work. If one person's goals actively exclude the other person's goals, better communication can lead to figuring this out, but not to solving the problem. Seeking to expand the domain of one's own field without worrying too much about the domain of anyone else's field could lead to a similar failure in interdisciplinary communication ideas.

Tuesday, May 30, 2017

Let's talk about fidget spinners and patterns.

Fidget spinners are a fad. Thinkpieces about fidget spinners, therefore, are also a fad. That's how it works, right? On one side, there's people who are arguing that these are toys (true), that they are a fad (true), that they can distract some people (true), that there is not research showing improved focus from their use (true), and that they are not an accessibility issue (false). On another side, there's people arguing that they are a focus tool for some autistic people and/or people with AD(H)D (true), that the lack of evidence is due to a lack of research and not a statement of inefficacy to use against individuals who find them useful (true), that this can be an accessibility issue (true), and that their fad nature among neurotypical students is bad (false) because it is getting the toys banned (mixed truth value). I've also seen more nuanced views, generally from disabled people, but those seem to be the two main camps.

I want to point out a pattern in how accessibility discussions go, especially in educational contexts.
  1. A disabled person needs something for access reasons.
  2. Abled people call the thing distracting, because our existence in public is apparently distracting.
  3. The thing is either banned entirely or permitted only for people with the paperwork to prove they need it for disability reasons.
  4. Disabled people who need the thing either don't have access to the thing or must out themselves as disabled in order to gain access. If outing oneself is required, the thing is heavily stigmatized.
  5. Disabled people who have an actual access conflict with the thing are erased entirely, which makes conversations about possible solutions to the access conflict impossible. One set of needs or the other will "win." Any disabled people who need to avoid the thing are lumped in with the people who want to ban the thing for ableist reasons and therefore vilified. Which set of needs "wins" here varies, but it usually has some relationship to hierarchy of disability stuff and having one set "win" while the other "loses" is a bad solution regardless.
That's not just a fidget spinner thing, but it does apply here. With fidget spinners, autistic people and folks with ADHD (I'd love to know of a reasonably recognized way of talking about this neurotype without the second D/in a neurodiversity paradigm way, btw) end up in both the "need the thing" and the "need to avoid the thing" groups. I assume some other neurotypes are similarly split as well - I just don't have the familiarity to assert so. With visual alerts on fire alarms, D/deaf people need the thing. Since the visual is a strobe, a lot of neurodivergent people, especially people with photosensitive epilepsy, need to avoid the thing. With service animals, the folks who use them need the thing. People with allergies need to avoid the thing, and not everyone with an allergy can safely share a space with a service animal, even if they are treating their allergies. Conflicting access needs exist, and this pattern prevents us from finding ways to deal with the conflicts. Instead, one access need gets lumped in with abled people who don't like the thing because it's associated with disability and therefore presumed not to be a real need.

Now for fidgets: some people need something to do with their hands while listening if they're going to retain anything. I am in this group, by the way. In high school, I knit, I sewed, and I made chainmail - armor, not spam. I've also tried drawing, which takes care of the "need to do something in order to sit" issue but takes enough attention that I'm no longer following the conversation, so that doesn't work for me in class. Writing hurts quickly enough that while taking notes has sometimes been possible at university, there was no way it was going to be the answer for the duration of a school day in middle or high school. (I, specifically, should not have a laptop in class. If I'm going to need notes it's the least bad option, but least bad does not mean good.) So I did assorted arts and crafts that were fairly repetitive and totally unrelated to class. The biology teacher who told us on day one that he had ADHD was both the most understanding teacher about my need to fidget somehow and the teacher most at risk of being distracted by my making armor in class.

That last paragraph is the "no, really, I need to fidget." It's also the "there are several fidget options that work for me." Most, but not all, of the standard fidget toys will meet my needs, as I discovered because they are also a fad and I got some awesome fidget toys. This is important, when access conflicts come into play - if there are several options that meet the access need of the first disabled person, it's easier to find one option that everyone is OK with. When there are several options that work, requesting "not option A in situation W" is not an access issue, because options B through H are still fine. If we're going to come up with reasons that each of B through H are also not fine, individually, then we're going to have a problem.

The fidget toy fad is making options D through H cheaper and cooler. When fidgets are marketed as assistive technology, they are super expensive. Considering that disabled people tend not to have a lot of money, that's an access issue, so the fad is making a set of possible solutions more accessible. That's cool. It's also leading to a sufficient presence for teachers to make explicit policies about the toys (as opposed to banning them person by person), and for a flat ban to seem like a good idea to teachers who are seeing kids appear distracted by them. (My bet is that the neurotypical students who appear distracted actually are. I expect the autistic and ADHD students who appear distracted are a mix of actually distracted because they are just as distractible as any other student and only appearing to be distracted because of ableist ideas about what paying attention looks like. Remember, I'd fail special needs kindergarten as a twenty-four-year-old PhD student.) The explicit banning for everyone is ... not so good. Mostly because the other options are usually also disallowed or heavily stigmatized, and then we may well be left with no good options.

And let's not pretend handing everyone a fidget spinner, or any other fidget, is going to magically "solve ADHD" or whatever. I think some of the camp that's firmly against the toys is reaching that position for similar reasons to haters of weighted vests - we hand it over and the person is still autistic, or still ADHD. A tool that a person uses to cope in a less than accessible environment doesn't make them stop being disabled by the environment. Plus a fidget spinner isn't going to help everyone. Some people really will be distracted if they have something to play with, and some of those people really will be neurodivergent. Conflicting access needs, again, are a thing. If one person needs a fidget, and another needs not to be next to someone with an obvious fidget, those two people probably shouldn't sit next to each other. Giving people fidgets that they can use while the toy remains in their pocket is also a possibility in some cases. We can have conversations about access conflicts, if we admit that both sets of needs exist. (We also need to admit that some subset of the people making arguments about distraction are doing the bad faith argument where everything disabled people need is a distraction because, essentially, our presence in public is a distraction.)

[Let's also insert a plug for my Patreon. I write. I have a Patreon.]

Saturday, May 20, 2017

"Your taste buds will change"

CN for food and vomit.

That's one of those sentences I read every so often, which is technically true, but which doesn't actually lead to the conclusions I see it used to support. Taste buds really do change with age! This is a thing that happens, and it's part of why there are certain foods kids tend not to like but which adults are more able to tolerate. (I think most alcoholic drinks go in this category, where kids tend not to like the taste anyways?)

As true as it is that tastes change, there are some things my brain has decided I need to explain now about why this doesn't mean getting into a power play with someone over what they eat and how they're "picky" is a good idea.

  1.  You probably don't know what the result of "pushing the issue" is going to be. I don't just mean long term results. I mean short term, in the minutes to hours right after forcing the (in)edible object down. Obviously, you don't expect it to be a big deal, or else you wouldn't be trying to force a "picky" eater to eat something they can't eat. How wrong are you ready to be? TMI alert, last time I made myself drink something that was an issue, it came back up. (If it hadn't been something I was medically supposed to have, I wouldn't have tried. It still didn't work, because it didn't stay down.)
  2. The fact that someone's tastes may change and they may be able to eat a food later doesn't mean they can tolerate it now. The change hasn't happened yet. So even if you're correct about the nature of the upcoming change, you're still trying to make someone eat something they don't currently tolerate. See point 1.
    1. Also, even if you were going to be correct, you can cause that not to happen by creating an association between being forced to eat the food and whatever sensory issue it's hitting. That can create a new issue with the food in question, besides taste...
  3.  The issue may not be the taste. I can't drink anything carbonated. You might think that's a rather broad category for a taste issue. You'd be correct. It's not a taste issue. It's best described as a texture issue, and you've said nothing about texture sensitivities changing. In fact, most of the foods I can't deal with are texture issues, not taste ones.
  4. The changes in taste may not be the ones you expected or hoped for. Some foods that were issues before can become non-issues, but it can go the other way too. As a very small human, I could eat mushrooms. As an adult human, I can not eat mushrooms. (It's also the texture, not the taste.) Chocolate pudding was a "safe" food for me as a kid. It's about 50-50 on my being able to eat it now. (Texture again. Also, partially related to times when I didn't get the choice about yogurt, which has never been an OK texture and which is close enough to pudding that making yogurt even worse made pudding a problem. See point 2.1.) I ... actually can't think of any foods I can have now that I couldn't deal with as a kid. 
Tastes do change as we get older. That doesn't mean they'll change the way you want them to, or that a possible change that hasn't happened yet justifies acting as if it's already happened. 

Thursday, May 18, 2017

Alyssa Reads Critical Studies of the Sexed Brain

This is another one I read for neuroethics. I was considering using this article for my presentation on a neuroethics-related topic, but that didn't happen because someone else split off my too-large group and it wasn't too big anymore. We actually wound up talking about a medication used to treat addiction ... that can itself be addictive. Fun times. So, here's some of my thoughts from reading Critical studies of the sexed brain.

“They suggest that we work and talk across disciplines as if neuroscientists were from Mars and social scientists were from Venus, assigning the latter to the traditional feminine role of assuaging conflict” (247). sigh I am not surprised that some scientists think of social sciences that way.

Brain plasticity+ identity formation in intersex people, brains vs. genitals. That's going to be interesting. By which I mean, I have concerns. I have friends who are intersex. I know people who do intersex activism. And I know intersex people who concluded that intersex and/or nonbinary is their gender identity rather than picking one of the two binary genders. Hope the author isn't assuming a gender identity must be one of man/woman. Heck, mine isn't that and as far as I know, I'm not intersex.

Oi at calling autism a disease. It is a neurodevelopmental disability [or a neurotype, that's a good word and also let's remember what I'm saying when I say disability - the social model of disability is a thing.] Also I know the author found neurodiversity stuff because the article comes up when I search the journal for neurodiversity, what the heck? I don't expect to hear it called a neurotype in anything done by neurotypical(-passing) academics but really? Disease?

Ok, gender in the brain as a result of plasticity, that's going to be interesting – “reflect gendered behavior as learned and incorporated in a social context” is a thing, but please, please don't let this turn into “male socialization” for trans women or “female socialization” for trans men, or either of the above for nonbinary folks. The socialization of “consistently mistaken for X while actually Y” is not the same as the socialization of “X.” Ok, individual differences are a thing. That's good. “Plasticity arguments are extremely interesting as they wage war against both biological and social determinism, reductionism, essentialism, and other -isms.” Phew that's not the socialization argument I was worried about, I don't think.

Does she mean “cishet” by “normal people”? (Cishet=cisgender, heterosexual.) I appreciate the quotation marks around “normal people” but there probably is another word for what she means and using it would be nice.

Now we have one of my rage buttons. All caps time!

Intersex activist history! I knew about unwanted surgery, gender role training, and folks wanting their own intersex bodies back. I also know someone who was put on unwanted hormones. What are the results of Diamond getting so lauded while speaking in terms of brain sex, though? It's still the language coming from the people who try to enforce the man/woman dichotomy. What are the results of using the "sexed brain" discourse while not necessarily fitting in the binary? 

1 Walker, N. (September 27, 2014). Neurodiversity: Some basic terms and definitions. Neurocosmopolitanism: Nick Walker's notes on neurodiversity, autism, and cognitive liberty. [blog post] Retrieved from is a good explanation of the neurodiversity related vocabulary I tend to use when thinking about neuro stuff.

Thursday, May 11, 2017

Alyssa Reads Memory Blunting: Ethical Analysis- suffering and authenticity

I read "Memory Blunting: Ethical Analysis" by the President's Council on Bioethics, excerpted from Beyond Therapy: Biotechnology and the Pursuit of Happiness (2003) and appearing in Neuroethics: An Introduction with Readings, edited by Martha J. Farah. I did so because I am taking a neuroethics class and we're supposed to show that we're thinking about neuroethics stuff at least a bit outside class. Also because I'm super-interested in how neuro-stuff (especially neurodivergence but really all things neuro-) is represented in fiction (especially young adult speculative fiction.) I'm pretty much chucking my notes (drawn parallels, expressions of annoyance, and the occasional "OK that's legitimate") on my blog because as important as a lab notebook is, I like notes that are typed and searchable. I started with some connections to Allegiant, then some thoughts on collective effects of blunting trauma, and then cognitive liberty. Now here's suffering and authenticity.

The concerns about what we might do to others' minds wouldn't apply if it were an issue of what person X does/chooses for person X, not what we are choosing for others. The concern seems to be about changing someone's true self, so suffering and authenticity come in again, just like cognitive liberty. These two seem frequently connected to me. If we recognize that people get to define their own "true selves", we don't get to moralize over which experiences are real and true anymore, which kind of kills the "not their true self" argument. Which is an argument I'm really not a fan of, especially considering which experiences it tends to be applied to.

This quote ... gives me the noble suffering/virtuous suffering sort of feeling, where whatever positive you might (not will, might) drag from the hell you go through means you shouldn't try to avoid that hell or save others from going through it.
Or will he succeed, over time, in 'redeeming' those painful memories by actively integrating them into the narrative of his life? By 'rewriting' memories pharmacologically, we might succeed in easing real suffering at the risk of falsifying our perceptions of the world and undermining our true identity. (90)
The version of a person that went through more bad things isn't automatically more real. The version of a person that's suicidal from trauma isn't automatically more real than the version of a person that takes medication to not be suicidal. Our choices define us, not just what we've been through, and using chemicals to get the parts of our histories we never chose to back the heck off? That's not less real. Suffering isn't the only way to be real. Enough of the noble suffering narrative. Enough.

Now to bring back a quote that I also talked about with cognitive autonomy:
And yet, there may be a great cost to acting compassionately for those who suffer bad memories, if we do so by compromising the truthfulness of how they remember. We risk having them live falsely in order to cope, surviving by whatever means possible. (92)
  (Survival is resistance etc)

And the concerns about what happens if we take out everything difficult? Those take a huge slippery slope argument, and not the kind where we've seen from experience that most people stop early or don't stop at all (destructive obedience is one of those.) Trauma is not the same thing as everything difficult in a person's life. Having to spend a lot of time and effort on reading and writing in order to become a good writer is not the same as witnessing a murder or being mugged or being a victim of abuse. One of these things is a choice: we're not under any obligation to become good writers. The others aren't choices. They're things that happen to us. How we deal with the results is at least partially a choice. (Not entirely. Especially when, due to technological or social constraints, dulling the pain while working through it isn't an option.) There is plenty of opportunity for hard work and achievement without forcing others to keep horrors in their heads for the sake of ill-defined authenticity.