Showing posts with label Vanessa. Show all posts

Friday, 6 December 2013

Reflections on Research Development

I've found that I haven't made a lot of changes to my research question over the course of the semester. I think a lot of that is because this is a topic I've been thinking about researching for a couple of years. As I discussed in a previous post, I spent a lot of time at the beginning of the course thinking about how to make the topic more practical and adapting my research question to suit what I've learned at the iSchool.

However, my topic is a relatively exploratory one, and my research question is a little broad. My methodology, unstructured interviews, allows for some adaptation during the course of the study to suit whatever I encounter at the time. So, perhaps, in the end I've left myself room to make last-minute changes. As a fairly indecisive person I do have a tendency to keep my options open as long as possible, and I think it's interesting to consider how that aspect of my personality might have influenced the design of my research plan. Now that you've reflected on the way your research developed, have any of you noticed a similar trend?

Friday, 29 November 2013

Peer Review - Ignored?

Earlier this year I was discussing John Bohannon's "experiment" with the peer review process of open access journals (CBC News, 2013) with a friend of mine in the sciences. His "sting operation" has since been criticized as biased (Taylor, Wedel & Naish, 2013), and even at the time it was published largely as a news item, so it's clearly not an ideal examination of the peer review process. What we ended up discussing, however, was her experience as a reviewer, as she had recently been asked to perform a review in place of her busy supervisor.

When my friend reviewed her first peer-reviewed paper she felt terribly guilty for being highly critical of it, but she found it fundamentally flawed and didn't recommend it for publication. As a participant in the experimental process herself, she knew how much work had gone into the paper, and felt bad for the scientists involved, but she knew that her duty as a reviewer was to be critical and honest about the quality of the research and its adherence to the scientific process. After agonizing over the review, she was very surprised to find that the paper she thought she had condemned to rewriting had been published despite her input. She also told me that many of her colleagues had reviewed papers they deemed unfit for publication, and had seen them go on to appear in peer-reviewed journals. Further, all of the members of her lab review for non-open-access journals.

While it's true that the peer review process involves multiple reviewers, and the input of a single scientist can't be taken as the sole possible view of a paper's worth, this raises concerns about the validity of the peer review process even when it is working as originally designed in a reputable closed-access journal. While there is much debate about the potential harm of changing the peer review process, there are still many questions to be asked about the quality of the traditional process as it is implemented today. How many of the reviewers are qualified to perform the review - and how many pass it on to a less experienced assistant or student working under them? How many reviewers approach the review seriously and pay close attention to the details of the paper? How well do editors and selection committees adhere to the advice the reviewers give? And how well does the peer review process truly represent the opinions of the scientific community it is meant to embody?

References
CBC News. (2013, October 14). Bogus science paper reveals peer review's flaws. Retrieved from http://www.cbc.ca/news/technology/bogus-science-paper-reveals-peer-review-s-flaws-1.2054004
Taylor, M., Wedel, M., & Naish, D. (2013, October 7). Anti-tutorial: how to design and execute a really bad study. Retrieved from http://svpow.com/2013/10/07/anti-tutorial-how-to-design-and-execute-a-really-bad-study/

Friday, 22 November 2013

Files, physical and digital

I'm a person who likes to keep things. When I was about fifteen I saw a filing cabinet outside a junk shop and I knew I wanted it right away, because I loved organizing things and the idea of keeping all my things neatly filed pleased me a lot. I still have that same filing cabinet, and I've only recently started using the second drawer for files - for a long time it was just for storing art supplies. That's because shortly after getting my first filing cabinet, I got my first personal computer, and I was able to save absolutely everything in digital form.

Over the years, I've had many different computers and I've reorganized both my physical and digital filing systems several times. Every time I move to a new house or a new computer I transfer all my files, bring along my legacy data, and set it up so it makes sense with any new additions. I've spent many hours creating elaborate and logical file hierarchies to make years of documents easy to find - at least for me. While I've had some trouble squeezing my filing cabinet into my tiny Toronto apartment, I've never had a problem making space for old files on a new computer. Digital storage keeps getting cheaper and more capacious, and cloud storage extends it even further. Physical space is much more limited than digital space, and I can't imagine a constraint that could stop me from hanging onto the digital copies of old school assignments that feel so important - although I haven't looked at them in years.

I know there's a flaw in my system, though, and if I'd like to keep good records of my research and work it's something I'll have to change. Right now, I keep files only in their original format - paper in my filing cabinet, Word documents on my hard drive and an external backup, and collaborative work on a cloud drive. While I have a few backups in case of problems, the best way to ensure something is kept safe is to have multiple copies in multiple formats and locations. I don't imagine I'll start printing out my PowerPoint presentations, but it would be much better practice to keep digital files both on a hard drive and in cloud storage. More importantly, digitizing my paper files would make it much easier to ensure they're preserved and available when I need them.
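The redundancy I'm aiming for - identical copies of the same files on an external drive and in a cloud-synced folder - is simple enough to sketch in a few lines of Python. The folder names and the mirror_files helper below are hypothetical, just to illustrate the idea of keeping every file in more than one place:

```python
import shutil
from pathlib import Path

def mirror_files(source: Path, backups: list[Path]) -> int:
    """Copy every file under `source` into each backup root,
    preserving the relative folder structure.
    Returns the total number of copies made."""
    copied = 0
    for item in source.rglob("*"):
        if not item.is_file():
            continue
        rel = item.relative_to(source)
        for root in backups:
            dest = root / rel
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(item, dest)  # copy2 preserves timestamps
            copied += 1
    return copied

# Hypothetical usage: one local backup target, one cloud-synced folder.
# mirror_files(Path("Documents"),
#              [Path("/Volumes/ExternalDrive/Documents"),
#               Path("CloudSync/Documents")])
```

A real setup would more likely use a dedicated sync or backup tool than a script like this, but the principle is the same: every file ends up in at least two independent locations.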

Thursday, 14 November 2013

Artifacts in Context

My proposal topic for the class is somewhat on the edge of a study of texts and artifacts, as it looks at how a certain group of people interact with a class of texts (medieval manuscripts). I suppose I framed my research largely from the perspective of studying a text or object because my background in English was so focused on that practice. In fact, my research proposal feels like a major departure from the study of texts to me, since the main focus of the research is on how specific users perceive the text.

When I think about the study of texts and artifacts and what they can reveal, I remember the major study I worked on for my Chaucer course as an undergraduate, which I didn't have nearly enough time or resources to complete. My original goal was to study "translations" of Chaucer's Canterbury Tales into Modern English, and to understand what the choices made by the translators and publishers revealed about the context of their creation and perceptions of the potential reader of the translation. It was a very ambitious topic, and I eventually scaled it down to look only at the framing of the texts, through prefaces, forewords, and textual organization.

Even with the changes I made, I found that the project was a bit outside the scope of my capabilities at the time, and it's always felt like a bit of an unfinished topic to me. I think that's why I chose an adaptation of the theme for my topic in this course. For me, the most interesting thing to study about a text or artifact is how it is perceived and received, how it is produced and presented based on assumptions about the audience, and the possible discord between the two realities. It's probably a major influence from my English training, but studying context and production seems to be the most interesting possibility for me.

Friday, 8 November 2013

Interdisciplinary Research Methods

Last year I took a course from the Knowledge Media Design Institute that focused on interdisciplinary approaches to research and academic practice. It was a novel experience for me, despite the increase in interdisciplinary research and programs throughout academia. While I've managed to cobble together a relatively interdisciplinary education through courses in the sciences, social sciences, and humanities, I haven't had many chances to take part in courses or programs aimed at breaking down the barriers between academic fields. What I noticed in the KMDI course was that many students had trouble reconciling the concepts and practices discussed in the course with their own work, and many of the final projects were grounded heavily in one of the two with slight tribute paid to the other. My observations in KMDI are consistent with my own experience of research methods education across different disciplines, and in light of the increase in interdisciplinary studies I find that concerning.

I first learned research methods in a Psychology program, and I found the course very specific to the discipline. Emphasis was placed on ethics and ethical review procedures - understandable in light of the history of psychological research - and on quantitative data and statistical methods. The research methodology we were taught was very focused on the science of psychology and numerical data, and on designing experiments in a decidedly non-salsa-dancing manner. A calculator was required for both of the examinations. In fact, a second research methods course from the same department focused entirely on teaching students how to use APA formatting and a statistical software package. While these may be useful tools for pure psychological researchers, there was very little exposure to concepts of selecting research topics, choosing from an array of methodologies, or incorporating influences from outside psychology.

My experience with biology was very different in that there were no courses directly focused on research methodology. Instead, every course with a laboratory or field component exposed students to the experimental process and different methods of analysis. It seemed that the expectation was for students to learn research methodology through inference and laboratory work, and use the statistical methods from math courses to support their analysis of data.

Studying English, I was exposed to different theoretical frameworks and approaches to study in seminar courses, but had little education in or exposure to methods for performing research. While research methods may not be a priority in the humanities, it seemed again in English that there was an assumption that students would learn by inference how to perform analysis or find resources for their work. I did notice, however, that of the three fields English provided the most opportunities to incorporate other disciplines or integrate theory and methods from outside the field. While there was very little education in how to do so, space was made for using interdisciplinary approaches.

Our current research methods course is the most open to interdisciplinarity I have encountered, which makes sense in the essentially interdisciplinary information field. However, in my experience the approach to research methods education is for the most part highly siloed and traditional. There may be research and methods designed specifically for interdisciplinary endeavours, but those are not the methods or approaches taught to students. This approach does not prepare students and future academics for the interdisciplinary work they will likely undertake in the future, and further leaves them unprepared to discuss research with scholars outside their field. A student's first exposure to concepts of research, experiments, and methodology biases and influences their future understanding of what is academically valid. If the approaches to what research means are so inherently different across fields, how can a group of researchers discuss and mutually design research work?

Thursday, 31 October 2013

Persuasion through Visualization

Although it's taken me a few days to make this post, I knew what I wanted to talk about as soon as I read the blog question for this week. It's been five years, but I was so struck by the effective use of statistics and visualization that I remembered these maps right away. I'm talking about the images produced by Mark Newman in response to U.S. voters who were baffled by the contradiction between the voting outcome maps after the presidential election and the voting numbers. At the time, many voters who favoured the Republican party claimed that the majority of the U.S. was red on election maps, and that small states had disproportionately large numbers of votes, unfairly swaying the election toward a Democratic outcome. This represents a fundamentally flawed understanding of the statistics that come out of elections, the set-up of voting regions, and the meaning of the U.S. electoral college.

People could have debated for a long time about why citizens didn't understand the election process or returns, or argued endlessly over numerical data. But in the end, many people simply can't conceive of what numbers mean, and the bias against "word problems" in math shows that turning the statistics into sentences isn't much help. For many people, seeing is believing, and graphical displays of data are what truly bring home the point of a study or article. Newman's cartograms, which rescale each state's area by its population rather than its geography, make it incredibly clear why the election gave the result it did, and why electoral seats are distributed as they are, more easily than pages of explanation could have done. While I've seen many interesting infographics in the years since, that series of maps has stuck with me since 2008.

References
Newman, M. E. J. (2008, November 11). Beyond Red and Blue: 7 Ways to View the Presidential Election Map. Scientific American.

Friday, 25 October 2013

Notes on Fieldwork from the Mud

My ideas about field work mostly come from my experiences in studying Biology during my undergraduate degree. It's possible my ideas of what makes something the field are more traditional because of this, but for me they're tied up in ideas of both location and the ability to control variables.

Most of my courses had a laboratory component of some kind, and most of those involved working in a lab to dissect and observe a specimen, examine slides under a microscope, or occasionally follow experimental procedures. However, the course that made the greatest impression on me was Plant Functional Ecology, a field ecology course, which involved bi-weekly visits to different forests and fields set aside for study.

In all cases the experiments and procedures followed in my courses were long-established, traditional methodologies. In each case a certain result was expected, as undergraduate labs are designed around well-known information. And in all cases, unexpected variations occurred. In a dissection laboratory, a fish might unexpectedly turn out to be pregnant, altering the observed anatomy. An ill-prepared slide might be hard to observe. A chemical reaction might not occur to the expected degree. The nature of experimentation, especially when performed by inexperienced and tired undergrads, leads to such incidents, and it's part of the process to examine research after the fact and ask what may have caused the observed results, whether false positives or unexpected negatives.

However, the main thing I learned doing field work in the sciences is that there is much more unpredictability and variation in the field. This may seem obvious in retrospect, but until I tried to explain why species abundances along a gradient transect refused to match an expected curve, I don't think I really understood the difference between the controlled conditions of a laboratory experiment and the infinite uncontrolled variables out in the field.

This might all seem hard to apply to information research, because knowledge work and libraries are so far removed from the muddy work of field ecology. However, I think the major takeaway for me about the difference between field and non-field experiments was that in the field, rather than controlling variables, we accounted for their uncontrolled nature, or controlled them artificially through sampling.

Another blog entry this week mentioned usability testing as field work, but working from my own experience - both in ecology and in usability testing - I think I would challenge that assertion. Traditionally, usability testing is done in a usability lab under controlled conditions, where a member of a user group is given precise, artificial tasks to complete. Their attempts are observed and their reactions to the process are recorded. This is clearly an experiment intended to get as close as possible to the actual user experience in the field, when the product is "live", but it isn't truly in the field. The user comes to the lab - again, traditionally. In some cases the usability team does in fact relocate to the user's home environment, and that may be close to a field study, but the tasks are still inorganically initiated, and they occur outside of the real framework they would be completed in.

In my research project, I'm focusing on the research approaches taken by undergraduate students exposed for the first time to medieval manuscripts. My study will occur in the field, as the subjects will be students of an actual course and will be approaching their research for the purpose of the course. While I intend to use usability-style post-task questionnaires and procedures as part of my methodology, I wouldn't consider my study one that uses usability methodology because the "tasks" will be much less controlled - they will simply be defined as the research procedures involved in completing the course.

I may be coming off as a bit of a field research snob here, but what I think is important to remember is that a lot of very valuable and important research occurs far, far away from the field. Field research is unpredictable and messy, and useful for exploring new research areas or developing new theories, but without controlled replication and determination of causation nothing can ever come of those theories. Pursuit of knowledge for the sake of knowledge is wonderful, but the real value of research is the ultimate benefits it can have for those outside of academia once it becomes actionable.

Friday, 11 October 2013

Ethics in Studying Undergraduates

While I have previously learned a lot about ethics in research, particularly in my psychology courses, I found the guest speaker on ethics very interesting. I was intrigued by the number of students in the class who hadn't heard of several famous unethical studies, and by Mr. Sharpe's apparent expectation that many people wouldn't have. This is likely a case where I'm simply taking my own knowledge and background for granted - an interesting problem from a research and ethics perspective, where awareness of bias is important in order to avoid unthinking mistakes.

My research topic is focused on information seeking behaviour, which means that my research will unavoidably involve interaction with people. In order to best study my topic - undergraduate information seeking behaviour and engagement with medieval manuscripts - the most useful approach is likely engaging with undergraduate students in the context of medieval studies. This means that the ethical considerations for my research will be substantial. I will have to ensure that my research does not at any point influence the students' studies or achieved grades, whether by causing additional stress, interrupting workflow, influencing the instructor or markers, or negatively impacting learning. While it might be informative to compare students using digital sources with students using print sources throughout the course, enforcing the use of specific sources when the consequences are unknown is unlikely to be approved by an ethics board.

Ultimately I decided to pursue unstructured interviews and usability-style post-task questionnaires in my methodology. While these may pose some risk to students, and a thorough examination by the ethics board is certainly necessary before undertaking the project, this should ultimately be an unobtrusive way to understand the research processes taking place throughout a course.

Beyond my relationship to my inevitable human subjects for this research, the monetization of research is always an ethical consideration. While my topic is not likely a very profitable or industry-influenced one, it is possible that publishers pursuing digitization or encouraging the use of print sources may attempt to influence or guide my research, particularly if I attempt to publish my resulting paper through that company. This is a fairly minor concern, but it is certainly necessary to be aware of the issue.

Friday, 4 October 2013

iSchool influenced research area

I missed my post last week, so I haven't posted a daisy yet and I don't have a graphic for it at the moment. My research interest seems to be focusing on how undergraduate students interact with manuscripts when they first encounter them in their education, with a focus on how the increasing preference for and familiarity with full-text digital sources impacts their interaction with the media. In order to bring the scope down to a Master's-level project I would have to focus more clearly on something - my tentative plan is to look at how undergraduate students seek out primary sources when researching medieval manuscripts, and their thought process throughout the search.

This topic is part of the larger field of information seeking behaviour, and represents a major change from my original approach to the topic, showing the clear influence of my studies at the iSchool. I have been intrigued by manuscripts and the study of medieval texts through manuscripts since my first medieval literature class as an undergraduate, and I've considered research in the area through many different lenses, although I haven't pursued any of them. For a long time my instinctive focus would have been an extension of an undergraduate paper comparing translations of the Canterbury Tales. However, studying at the iSchool has changed how I view my studies, to the point that it has shaped my research area itself.

One of my most distinct memories from studying the Canterbury Tales is that, over the course of a semester, I started developing significant back and shoulder pain from carrying my huge edition of the Riverside Chaucer to and from class every day. I actually complained to my professor that he asked us to have the text in class too often when we didn't need it, and he apologized. At the time I desperately wished for a digital text, but I made little effort to search for one. I remember that near the end of the semester the professor introduced the scanned Hengwrt manuscript to some students, but since I had focused on translated texts it didn't seem relevant to me.

What strikes me the most about these memories now is not my desire for a lighter text or a digital edition, but that I never put significant effort into finding a digital text despite my desire for one. I attempted a few brief searches on my own, but I didn't put significant effort into the search, or look to my professor or a librarian for assistance. The questions I want to ask now are about how my professor's presentation of the text influenced my commitment to the Riverside edition, how the media of the text influenced my interactions with it and estimations of the likelihood of finding a digital copy, and how my perception of the text and media influenced my search for secondary materials. In short, my analysis of my past experience is now shaded in the tones of information studies.

Monday, 23 September 2013

Methodological Restraints

My background is primarily in English, but I've also studied Biology, Psychology, and even a bit of Physics. I'm familiar with a lot of the conventions of research in the humanities, sciences, and social sciences. I've thought a lot about interdisciplinarity and mixing methods for research, and I find it fairly easy to relate to Luker's concept of salsa-dancing researchers. However, when I sat down to write about my own research ideas, I found myself drawing a complete blank.

This is particularly strange to me because I find it very easy to be interested and curious about things. I often come up with questions about the world that I'd love to research and answer, or at least take a step toward answering. But I found that in the frame Luker created, one focused so closely on sociology research, my ideas about research became limited to the sort of topics she was posing. The only questions I could think of were ones related to, say, human sexuality, or online communities. I was hesitant to write them down, though, for two reasons. First, those questions felt highly unrelated to my current studies in libraries and information systems, particularly since I wasn't focusing on information seeking. Second, I was primarily thinking of approaching them in fairly rigid, scientific ways that I felt Luker wouldn't approve of. I'm sure that thought is itself problematic from a free-wheeling, salsa-dancing perspective, but the whole experience was particularly startling to me since, previously, I've been very interested in pursuing research in Chaucerian studies, and because there are so many information science topics I'm interested in.

It was very interesting to me that my conception of what made interesting research was so influenced by Luker's perspective in her text - and perhaps that in itself could be an interesting research project: how researchers conceive of their research questions, and what influences them to pursue specific methodological avenues.

I think that what I need to keep in mind in the future is that Luker's ideas and advice are useful and interesting, but her perspective is very focused on the world of an academic social scientist. To keep myself open to all of the topics and approaches I find interesting - and useful - I need to remember that research is conducted in many different ways, from the highly controlled and regimented quantitative lab studies in physics and microbiology to the practice-based case studies common in library science.