Helping Google stifle Black Hat SEO

Search Engine Optimization (SEO) had always seemed to me like a strategy that web vendors and spammers used to generate traffic to their sites. As a web user and novice web designer, understanding how Google separates white-hat and black-hat SEO (essentially, the good guys from the bad guys) helped me make sense of it a little more.

Google also depends on regular searchers to help by pointing out when something is wrong with their search results. If you do a search and something doesn’t fit or seems sketchy, you can point it out to Google. Kevin Purdy, in his TechRepublic article “Give Google better feedback and bug reports and get better results,” shows us how.

screenshot of Google Feedback link and popup

Have you ever noticed the “Send feedback” link at the bottom of your search results and thought, “Yeah, no thanks. I’m not writing an email to Google or going to another tab to fill out a form. I just want better search results, so I’d rather spend my time trying the search again”? As it turns out, it’s JavaScript that keeps you right on the page, where you describe the problem in words and then show Google by highlighting what was wrong. Then you can go right back to your search. Google takes this feedback seriously and uses it to improve its algorithms. Maybe you don’t see the direct results, but it’s for the “betterment” of the web!

I also have to give credit where credit is due: I learned the most about SEO and what Google does from an article I read in another library school course, LIS 451 (though I liked Purdy’s visual take on one of the biggest takeaways I got from that article): Cahill, K., & Chalut, R. (2009). Optimal results: What libraries need to know about Google and search engine optimization. The Reference Librarian, 50(3), 234-247.

Reference:

Purdy, K. (2012, February 21). Give Google better feedback and bug reports and get better results. TechRepublic. Retrieved December 5, 2014, from http://www.techrepublic.com/blog/google-in-the-enterprise/give-google-better-feedback-and-bug-reports-and-get-better-results/ 

Instructional librarians and a culture of professional development

In my course discussion this week about professional development for information and technology literacy, one of my classmates suggested the importance of maintaining funding for professional development. She spoke of how much she appreciates the opportunity to attend conferences in her current position, and she acknowledged that protecting this funding sounds idealistic, especially in the public sector, but she felt it was important anyway.

Just because you spend money on something doesn’t mean that you will get results, BUT dedicating money to something (or not) in a budget DOES show your commitment to that thing. I think my classmate’s suggestion about keeping money available for professional development is a wise one.

Let me offer an example of the consequences:

Due to some of the budget side effects of Act 10, my school district slashed this kind of professional development funding, and it seems we are often told, “No, you may not attend X conference; we don’t have the budget.” Additionally, the Tech Department has been allocated less money and seems to operate mainly as firefighters, trying to put out “fires” with the network and hardware to keep us running, so they can’t really devote resources to serving as instructional leaders either. Which leaves the librarians… As far as I can tell, with the upper levels of support stripped back, it is these professionals left at the front lines to help, and they are rockstars! Unfortunately, that help probably only happens at the one-on-one, ad hoc level, instead of as systemic, intentional training. Every time I reach out for their assistance, though, I’m met with, “What can I help you with?”

While it is possible to be grass-roots and low-budget, money greases the wheels!

From my humble point of view (because I’m not going to tell anyone how to do their job), to ensure that staff participate in long-term, ongoing professional development for info/tech literacy, it’s the LEADERSHIP (principals, administrators, directors) who must also believe in the value of developing their staff. The instructional librarian can advocate for this kind of training on their own, but the leadership has to give them the time of day. (The librarian will probably have to sell it to their administration first.)

To be an instructional leader like this, the librarian has to be a good communicator, able to read timing, body language, and institutional culture. For example, if the librarian’s position has been viewed as lowly, then the librarian will probably be better off treading lightly and not coming in with a tour de force (unless they want to risk their employment status). Instead of shoving change down people’s throats or condescending to them with unsolicited criticism of what’s wrong with the system, working with people to solve their problems will probably earn more respect.

The other thing I would advise is to ask questions at all levels and try to find the holes in info lit/tech skills and the perceived needs of the building. This way the librarian can find a place to fit their expertise and leadership, and potentially come in and save the day. It takes time!

The best advice I was ever given in my career (and I didn’t know it then; I had to learn the hard way) was that “changing a school (we could say a library) is like turning an ocean liner.” There are a lot of moving parts, and things are bigger than just you. Plus, we don’t need to capsize the whole ship.

Annotation: Online Tutorials

Bautista Sparks, O. (2010). Five minute screencasts: The super tool for science and engineering librarians. Issues in Science and Technology Librarianship, 60. doi:10.5062/F4JH3J4S. http://www.istl.org/10-winter/tips.html

This article explores the use of screencasting as a tool for librarians to create online tutorials. Several examples of video screencasts for instructional purposes are featured, such as orientations, reference consultations, class instruction and virtual library workshops. The author wrote this article for a science and engineering librarian audience, but her tips are applicable to most instructional librarians. She offers a section discussing the different features of screencasting tools in order to assist librarians in choosing a tool. There is also a table comparing four common free tools. Because this article was published in 2010, these comparisons and features may already be outdated or inaccurate. Her tips for creating screencasts are brief, primarily discussing the logistics of how you might set up the content you want to record. She does, however, reference several other sources with instructions and checklists, though these are even older—from 2009. I suspect that such guidelines for creating a user-friendly experience do not expire as quickly as other digital trends.

A critique of online and live library workshops

Part 1: CLUE (online library orientation)

UW Libraries’ Campus Library User Education Tutorial (CLUE) presented a succinct, yet complete overview of beginning research strategies that freshman Communication Arts students would need. It was especially sage to predict and address the common misconception among teenagers that they can “just look it up on Google,” as shown in Module 2: College Level Research. The librarians do a good job here selling why library services are valuable to academic research and explaining how college-level research could be different from what students may have experienced before.

It is nice that each module in the tutorial is divided into two- to six-minute videos. Since the videos include embedded, interactive quizzes, students can be held accountable for the information if they are required to use CLUE as part of a course. This is also good pedagogy because it forces students to reconsider the most important parts of each module, aiding comprehension and retention. Because the quiz questions are dispersed throughout each video, students are more likely to stay engaged and not lose focus by the end.

An improvement I might suggest would be providing access to an outline or transcript of each module’s content, as a take-away tool, once a student has successfully completed the quiz. Requiring students to print their successful quiz results seems a little low-tech, given that the libraries obviously use an advanced screencasting application, Adobe Captivate, to create these interactive videos. Also, students sometimes do not have handy access to printers, making the print-certificate requirement cumbersome and/or a barrier to success. We have to be sensitive to the digital divide with hardware access. Perhaps a solution would be to give students the option to click a button to share their results digitally with their instructor.

While I found the content of this tutorial to be very useful and wish that my undergraduate training had included something similar, it would have been a good idea to include a practice database search among the modules, even if the search were completely optional. The quizzes provide comprehension checks but do not guide students to apply their new skills. Students will likely soon be asked to do so in their courses, but offering additional practice with this content might be welcome for some. Obviously, database content and search results change regularly, so it would be difficult to verify students’ work or provide something for them to compare against (unlike the controlled responses of the quizzes).

Part 2: A live library workshop

On November 4, 2013, I attended a workshop at a library on the UW campus about resources for Act 31, a Wisconsin law requiring teachers to include instruction on Native American culture, customs, and history. The workshop was led by one of the library’s graduate teaching assistants and a former advisor for the American Indian Studies program.

The setting was relatively informal, since the workshop did not require prior registration and there ended up being a small number of attendees. The presenters arranged the chairs in three or four rows of a semicircle, with a notecard sitting on each chair. There were also laptops set up on side tables so that participants could complete a short Google Forms survey at the end to provide feedback to the presenters.

The primary difficulty during the workshop was a technology failure. The computer that was hooked up to the projector had an unreliable Internet connection and seemed to be struggling to respond. The library staff decided the machine probably needed to be re-imaged, but they did not scramble to swap a laptop into the projector setup. The presenter tried to lead the discussion without the slideshow while the computer caught up, but at one point we had to watch a video on the screen (and through the speakers) of a MacBook set on a chair.

It is smart to have a backup plan in case technology fails you when you are teaching, but in this case, I would have expected the library to have anticipated problems with a machine that was known to need updating. The majority of the resources were digital, so going low-tech was not a good option. There were lots of handouts available to take away, including articles and teaching strategies, but I might have appreciated a printed list of these ideas compiled on one sheet.

The presenter offered a reward to the attendee who answered her first question, which is a nice way to encourage participation, but it was the only reward offered, which was a little disappointing. (Who doesn’t like free stuff, after all?) We were also guided through an activity idea called “Descriptive Art,” a way for teachers to share Native American art and culture with students respectfully. I really enjoyed the interactivity and practical application of this activity.

It was also nice that there were visual, multimedia, and discussion aspects to this workshop. I left the library feeling curious and impassioned to learn more about native cultures. I had my laptop with me so that I could take a look at some of the websites the presenters were referring to, and I imagine it would have been valuable for the other attendees to do the same. The workshop was only scheduled for one hour, and the presenters were very sensitive to this by starting and ending on time. They could have included a segment where participants spent some hands-on time with the resources on computers and talked about teaching ideas, but there simply was not enough time. Perhaps 90 minutes would have been more realistic for covering the resources they were sharing.

(P.S. I didn’t mean this part to sound negative, like I said–I was really enthusiastic afterwards about the content. I learned SO much! I was just making observations and trying to think of ways to troubleshoot some of the glitches.)

Annotation: Badges for Higher-Ed Assessment

Buell, C. (2013, August 30). Using Badges to Quantify Learning Outcomes at UC Davis. Edcetera. Retrieved October 20, 2013, from http://edcetera.rafter.com/using-badges-to-quantify-learning-outcomes-at-uc-davis/

This article examines the use of badges to measure learning outcomes in higher education, especially as developed by UC Davis and Joanna Normoyle, who won an award for the innovation at the Digital Media and Learning Competition. The idea is to quantify and standardize higher-level thinking skills gained over the course of a university degree and to award a digital badge for each achievement. Such badges could be useful to future employers trying to determine a candidate’s skill set, and they could even simplify credit transfer between universities looking for equivalent coursework. Badges in such a system can also help learners track their progress and customize an academic program. UC Davis is officially launching its program with students this fall. By extension, I could see this as an easy, practical way for academic libraries to jump in, partner with departments, and get involved in assessing and communicating the information literacy of students.

Setting assessment policies in the syllabus

Part two of the syllabus analysis I began earlier this week…

My sample syllabus (the culinary arts one) addressed assessment as follows:

Assessment Strategies Used: cooking labs in the kitchen, quizzes, tests, homework, and projects.

The instructor neglects to explain whether they are using a total-points system or weighting grades by category. There appear to be 16 course standards over 12 units, with two to three per unit and some repeating in later units. I suspect that course grades are simply based on competency for each course standard, with each course standard being worth the same value in the final grade.

For example, for this course, Standard 1 is, “Demonstration of proper cooking techniques that result in a quality end product while employing safe and sanitary methods.” Standard 8 is, “Knowledge of food service equipment, including identification and use, mise en place, knife skills, and seasoning.” I am speculating here, but students might be asked to do assignments, tests, quizzes, etc. that give the instructor a piece of evidence, an artifact that reflects their learning, as a competency for each standard, but at the instructor’s discretion, not pre-announced. The idea is to get kids to focus on learning, not just the acquisition of points.
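
To make my guess concrete, here is a minimal sketch in Python of how an equal-weight, standards-based final grade might be computed. The standard names and scores are invented for illustration; the syllabus gives neither, so this is just my speculation made explicit.

    # Hypothetical competency scores (0-4) for each of the 16 course standards;
    # the names and numbers below are made up for illustration only.
    scores = {
        "Standard 1: proper cooking techniques": 4,
        "Standard 8: equipment, mise en place, knife skills": 3,
        # ...one entry for each of the remaining standards...
    }

    # If every standard carries equal weight, the final grade is just the average.
    final = sum(scores.values()) / len(scores)
    print(f"Course competency average: {final:.2f} out of 4")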

The syllabus doesn’t actually confirm this, though. It’s hard to say whether it would be a fair assessment of student learning, or whether the assessment is qualitative or quantitative, without more information on how the gradebook is set up.

To me, removing the points game is really important. The focus should be on learning, not jumping through hoops. And grades should reflect said learning, not the ability to play the game. At the same time, when students know how they are being held accountable, they have more ownership, which encourages the intrinsic motivation that we want learners to have and leads to real learning!

A look at the instructional design of a syllabus

This week, I chose to look at the instructional design of a syllabus from a high school culinary arts course. (Isn’t it cool that they offer this? It’s a year-long, two-period course.)

First of all, it is worth saying that high school syllabi are typically MUCH different from what people are used to seeing at the university level. Any teacher who is organized enough to lay out course activities/objectives for 180 days is 1) insane and 2) not as good a teacher as they claim to be, because they don’t leave flexibility for responding to formative assessments, like when kids need to be retaught something or taught in a different way. That said, I did once work with a middle school science teacher who had her outfits laid out from Christmas through Spring Break. She was 1) hilarious and 2) amazing, so I wouldn’t put it past her to pull off what I just declared impossible. She may have evolved since the outfit-planning days. (I sort of hope she sees my shout-out to her here.)

While I appreciate having all of the information up front in a university course, where I’m planning out work schedules and other class demands, in a high school, I think it’s probably not the greatest idea. Kids aren’t usually comforted by information overload; they are stressed by it. So, if you want to deal with perpetually fielding multiple, individual complaints of “I don’t get it,” try giving all of the information at once. This goes for scaffolding a project just as much as for a detailed syllabus. Also, if you want kids to be responsible for the information in the syllabus, you have to go over it with them, and when they sit through that six or seven other times during the first week of a course, short and sweet is usually good advice. The five pages of my sample syllabus are pushing it, though the “meat” really consists of two pages.

The syllabus I looked at was divided into these sections:

  • course description
  • texts and resources
  • behavior/discipline
  • class units for 1st and 2nd semester (12 in all)
  • standards-based instruction explanation
  • assessment strategies
  • field trip policies
  • contact info

Our school is big on standards-based reporting, assessment, and instruction. This seems like a no-brainer, but you’d be surprised how RARE true standards-based reporting and assessment are after elementary school. Elementary schools actually do it: think about your report cards with different skills listed and all the Es, VGs, Ss, and Us (or whatever your school called them). That is standards-based reporting.

How do I know that secondary ed (and higher ed) don’t really do this? Because parents and students continue to be fixated on their percentage and letter grade. Also, the computerized grading tools out there (PowerSchool, Skyward, Infinite Campus, etc.) don’t really support standards-based reporting yet, though they are very good at showing a list of assignments with a percentage and a letter grade. I also see a lot of tests in my role as a support teacher, and there are still plenty of teachers out there giving tests with one grade at the end, not several grades broken down by standards-based skills.

That said, I believe that our school is ahead of the curve when it comes to standards-based assessment (though standards-based instruction is different; we are still a work in progress there). What I liked in this syllabus is that the standards were listed in the description for each unit, instead of in one large list for the whole course (which is sort of what I do :/ for my own syllabi). Raising awareness of the focus of each unit as it begins is a great way to plan instruction, because students can reflect on their learning process (metacognition!).

The section on assessment strategies, though brief, was good too, since students want to know how they will be graded and held accountable. The contents of the explanation are pretty predictable, but sometimes students might be surprised by and hostile toward the appearance of an oral presentation or something if they haven’t been warned.

The discipline section is also very important for a high school course. This one refers to the school and district guidelines in the Student Handbook. While this is a sound and clear-cut approach, I think it might also be useful to reference some of the kitchen-specific safety rules in effect in this classroom. However, perhaps the signage in the room and the teacher’s orientation to the facility are sufficient.

I already had a lot of respect for the caliber of instructor and the culinary program at this school and was pleasantly surprised by the design of the syllabus and course. If I had to give a “grade,” I’d say that this one’s an A, Exceeds Expectations!

Annotation: Instructional Design

Hovious, A. (2013, September 22). The “Rule of One” and the One-Shot Session. Designer Librarian. Retrieved October 13, 2013, from http://designerlibrarian.wordpress.com/2013/09/22/the-rule-of-one-and-the-one-shot-session/

This blogger-librarian discusses how librarians often have only one opportunity to reach students in an information literacy training session, requiring the librarian to seriously consider instructional design if he/she wishes the instruction to be effective. The author lays out four guidelines for librarians trying to design a successful stand-alone session: one learning goal, one objective per task, one strategy per objective, and one culminating activity. This is an important idea, since many students have limited contact with librarians and may not consult with one unless they are required to; the first impression matters not only for possible future reference desk use but also for the information literacy skills that are developed (or under-developed) in the students the library serves. Even if a teacher-librarian has the opportunity to adjust his/her instruction based on formative assessment or a student needs analysis, the guidelines in this article are worth implementing because they make for a tight, power-packed lesson.

A new librarian’s collaborative dilemma

I’ve learned that it’s usually much easier to walk into a mess than to be the next act following a rockstar. (Sometimes, this also applies to dating.) Any improvements you make will typically be well received. However, sometimes people lower their expectations and get used to treating your role as an ineffective one.

This week in my information literacy course, we were faced with cases of hypothetical librarians struggling to create collaborative relationships in their libraries. The hypothetical middle school librarian approached a seemingly friendly colleague and offered to work with him on a research project to integrate some information literacy skills. He shot her down and questioned her ability to help him with social studies. The librarian’s predecessor probably never worked like this with teachers, and they were probably pretty used to taking care of themselves. I would also suspect that the former librarian didn’t play well with others in general. When faced with a disheartening rejection like this one, it is pretty tempting to back off. She could try to rephrase her offer, possibly starting small by offering a simple mini-lesson with his class about using a database, held in the computer lab where he feels more comfortable. He may not bite though, since he has already blown her off.

Another thing to try is to simply offer her services to another teacher (and if she stuck with the Social Studies department, the endorsement would be more likely to sell her first rejector on it later). Sometimes teachers are grumpy or have a hidden grudge that you might not be able to predict. A silly example, but last year I tried to organize a morale-boosting lunch treat in my building, hosted by teachers with March birthdays. I didn’t get a lot of response after my email, so I decided to check in personally with the silent parties before ditching the idea. I checked in with Mrs. B, and she shot me down so cruelly that I walked out of there with a trembling lip. (I mean, really, asking her to bring in a bag of shredded cheese apparently was out of line. But she didn’t have time to have lunch, she said.) I almost gave up, but checked in with another science teacher next door to her, whose response was, “Yes! What do you want me to bring? How can I help?”

My point is, you just never know “who’s in” or “who’s out.” Baby steps. Building a culture isn’t always easy.

My suggestion for the school librarian’s plan of action:

  1. Make a menu of quick mini-lesson or push-in instructional ideas that teachers could use her for… email it out and make some small cardstock/laminated bookmarks or magnets that she could stuff in teacher mailboxes so they’ll have it around and think of her sometime. She’ll have to start small to build a culture.
  2. Try again with the nay-saying social studies teacher, but don’t expect him to bite until she has a track record. Approach other teachers in the social studies department personally with the above-mentioned menu of services.
  3. Try attending a middle school team meeting a few times and just listen. She might get some ideas on what teachers are struggling with and find ways to help. Showing up regularly would build trust and credibility.

Annotation: Collaboration

Immroth, B., & Lukenbill, W. (2007). Teacher-school library media specialists collaboration through social marketing strategies: An information behavior study. School Library Media Research, 10, 1-16. Retrieved October 6, 2013, from http://www.ala.org/aasl/aaslpubsandjournals/slmrb/slmrcontents/volume10/immroth_teacherslmscollaboration

This study examined how social marketing strategies can be a tool for fostering collaborative relationships in schools among teacher-librarians, content teachers, and student librarians (i.e., practicum school librarian graduate students). The study is thoroughly documented and reflects on its validity as exploratory research, but I was most interested in the strategies the librarians tried. By using the concept of social marketing, which aims to benefit the audience and society instead of the marketer, and the Attention Interest Desire Action (AIDA) model, the researchers gave their participants a common framework for pursuing this type of collaboration. Though it is written as an academic research article, its application of the AIDA model could potentially be more useful to practitioners seeking to improve their collaborative efforts than a simple list of tips, like eating lunch with other teachers in the teachers’ lounge or sharing the names of new DVDs with targeted teachers.

Figuring yourself out

This week, I was asked to complete an online survey about my learning style. I didn’t really have a learning style in mind, though I would have guessed that I’m not a kinesthetic learner (yes, that was confirmed). I was on the fence a little between the other results, which were within one point of each other. I suspect that while I may be an auditory learner one day, I may be more of a visual learner another. For example, I sometimes take notes because it helps me process what I’m hearing. What kind of learning is that? Visual? Kinesthetic? Who knows?

I also scored myself on a sort of Myers-Briggs test (from a chapter in the book A Teacher’s Guide to Cognitive Type Theory and Learning Style by Carolyn Mamchur) and got another surprising (or maybe not so surprising) result. Apparently, I am an ISFJ (Introvert Sensing Feeling Judging) now, but when I had the real-deal, copyrighted, official paper Myers-Briggs questionnaire in front of me for some kind of high school leadership training at 17, I was an ESTJ (Extrovert Sensing Thinking Judging). I was talking to a psychology type about it yesterday, and we discussed how this could be. Basically, there were a lot of close scores: 4 to 3, 6 to 1, 4 to 3, 4 to 3.

I agree now that I am definitely not an extrovert, and probably really wasn’t in high school either, but you convince yourself that extroversion is the valuable personality trait to have, especially since teenagers are supposed to be all about the social. The Feeling result was also new to me this round; I’ve never had it “win” when I’ve tried online Myers-Briggs-like tests. However, I have NEVER wavered on the Sensing or the Judging: I’m neither the Intuitive nor the Perceiving kind.

Technically, it’s probably better to call me an iSfJ or even iSfj (though I would call the J stronger than this quiz says, just because I’ve always scored as a J). When I read Mamchur’s descriptions, ISFJ does fit (especially given how my life/career have been lately), though so does ISTJ, which is what I kind of expected in the first place.


Mamchur’s descriptions of ISTJ and ISFJ (1996)

All in all, I’ve always liked the Myers-Briggs as a measure because it’s detailed and pretty fun, but it’s also kind of like a horoscope. Should we really be making life decisions based on it?

I think we could probably do a little better with learning/personality typing apparatuses than the USD learning style survey or the Mamchur quiz. They were quick, yes, but I think a little over-generalized too. A colleague of mine does learning inventories with her students and says she uses them to plan instruction (this is a tall order… but it’s good to think about, even if only sometimes!). She uses some device with the seven or eight multiple intelligences, and she has the kids chart their individual results on one of those graphs that looks like a spider web (and then they all hang them on the wall).
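
That spider-web graph is just a radar chart, and it’s easy to mock one up. Here’s a minimal sketch in Python with matplotlib, using invented scores for eight intelligences; it’s only an illustration of the idea, not my colleague’s actual tool or data.

    import numpy as np
    import matplotlib.pyplot as plt

    # Hypothetical scores (1-5) for the eight intelligences; made-up data.
    labels = ["Linguistic", "Logical", "Spatial", "Musical",
              "Bodily-kinesthetic", "Interpersonal", "Intrapersonal", "Naturalistic"]
    scores = [4, 3, 5, 2, 1, 4, 5, 3]

    # One spoke per intelligence; repeat the first point to close the polygon.
    angles = np.linspace(0, 2 * np.pi, len(labels), endpoint=False).tolist()
    angles += angles[:1]
    values = scores + scores[:1]

    fig, ax = plt.subplots(subplot_kw={"polar": True})
    ax.plot(angles, values)
    ax.fill(angles, values, alpha=0.25)
    ax.set_xticks(angles[:-1])
    ax.set_xticklabels(labels)
    ax.set_ylim(0, 5)
    plt.show()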

Provided we could find a really strong quiz device, I think it’s a valuable strategy to encourage learners to be self-aware. Know thyself, right? It would be great if instructors paid attention to it more, but I just don’t think they do so consistently. I probably don’t. I mostly try to design things that are interactive and involve tasks. If a design addresses multiple learning styles or intelligences, it’s probably a lucky side effect.


learning retention rates

I think awareness of brain research on learning is generally the most important thing: knowing people’s attention spans and retention rates (like that pyramid where people only retain 5% of material from a lecture, etc.). I think gearing library teaching tasks to these kinds of ideas is the way to go (instead of setting something to music…).

That said, I do play classical and mood music when my students are silent reading or working independently, to help them tune out conversations and other distractions. It works!