
Regina Folk Festival Piloting Accessibility Project Using Hearing Loop

“I think this will have a wider-reaching impact than it was probably intended for. That’s very cool.”

By Creeden Martell
Published July 18, 2022

The Regina Folk Festival (RFF) is piloting a project at this year’s event with the aim of improving accessibility for the hearing impaired at future shows with the use of a Hearing Loop.

The loop is a copper wire inserted into the ground, and it can be used in a wide variety of environments, such as meeting rooms, theatres or, in this case, a music festival.

It converts audio to a magnetic signal that is transmitted to a person’s hearing aid, implant or headset, allowing those within the loop to hear the entertainment more clearly by eliminating background noise.

“Then you’re able to use your hearing aids or implants to tune into the sound right off the sound board,” Josh Haugerud, executive director of the RFF, said in an interview Monday. “You can basically adjust the sound so that if you have hearing issues or if you have sound sensitivity or if it just sounds muddled to you, you can make it sound the way you want it to sound and be more comfortable.”

The loop was purchased with the help of a grant from the City of Regina, which was just under $8,000, according to Haugerud.

There are 4.6 million adults in Canada who have experienced some degree of hearing loss in the speech-frequency range, according to Statistics Canada. People with sound sensitivity could also use the loop, the RFF noted in a news release.

The pilot will have the loop in a designated area near the soundboard to determine its capabilities. If there is no hearing aid to tune into the sound, there is a headset option. Haugerud said the RFF wants to make sure it’s working 100 per cent before it starts charging people in future events.

The RFF has given out passes to the upcoming festival to organizations in the city, such as Saskatchewan Deaf and Hard of Hearing Services (SDHHS) and Regina’s Autism Resource Centre (ARC), which will give the passes out to those who can use the loop and provide feedback.

“It’s really cool to actually have like a double impact where it can help those who are deaf or hard of hearing, but it could also help those who might experience sensory overwhelm if it’s too loud, too busy, too crowded,” said Diandra Nicolson, employment coordinator at ARC.

The organization received four weekend passes from the RFF, which will be available to those interested.

Nicolson said neurodivergent people, including those with autism, ADHD or other sensory-processing disorders, could use the loop to block out extra noise and focus on the music, which is why people go to the Folk Festival in the first place.

Nicolson, who has ADHD, said she wishes there had been similar technology at other festivals she has attended, such as a show in Winnipeg earlier this summer.

“It was super overwhelming and loud. When there’s conversations going on all around you, it’s hard to focus on the music and actually enjoy it,” Nicolson said. “I think this will have a wider-reaching impact than it was probably intended for. That’s very cool.”

She said the loop could have the same effect as a curb cut, a reference to the way accessibility features and designs end up being used and appreciated by more people than just those who require them, such as someone with a physical disability.

“I think that this loop pilot project will have the same effect, that it might benefit people who are deaf and hard of hearing first and foremost, but then it could also help other neurodivergent people – people who are autistic, ADHD or people who just get overstimulated in loud, crowded events, I think it could just help make the festival more accessible to many, many people.”

crmartell@postmedia.com

Original at https://leaderpost.com/news/local-news/regina-folk-festival-piloting-accessibility-project-using-hearing-loop

Litigation Against New York State Board of Elections Resolved

New York to Create Statewide Accessible Absentee Ballot Program

New York, NY
April 5, 2022

The National Federation of the Blind of New York State, American Council of the Blind of New York, Inc., Center for Independence of the Disabled, New York, Disability Rights New York, and several New York voters with disabilities, including Rasheta Bunting, Karen Gourgey, Keith Gurgui, and Jose Hernandez, have settled the Americans with Disabilities Act (ADA) lawsuit they brought against the New York State Board of Elections (NYSBOE) in 2020.

A federal court has approved and ordered the terms of the settlement agreement, under which NYSBOE will create a statewide program allowing blind and disabled voters to fill out a remote, accessible vote-by-mail ballot online, print it out, and mail or return it to their county board of elections.

The settlement requires NYSBOE to choose a remote accessible vote-by-mail (RAVBM) system that allows blind people and people with print disabilities to use their own computers to read and mark a ballot, using their own screen-reader software that converts the ballot content into spoken words or into Braille displayed on a connected device. NYSBOE must also create a statewide portal that voters can use to request an accessible absentee ballot and train each of the fifty-eight county boards of elections on the use of the RAVBM system. The county boards of elections will provide return envelopes for the ballots, just as they do for paper absentee ballots. The inside oath envelope into which the ballot is to be placed must have a tactile marking indicating where it is to be signed, and NYSBOE will instruct county boards of elections to accept a signature anywhere on the envelope. County boards will also be required to help voters who do not have their own printers to facilitate the printing of their ballots. NYSBOE will also pay attorney’s fees and costs of $400,000.

Plaintiffs were represented by Disability Rights New York, Disability Rights Advocates, and Brown Goldstein & Levy, LLP.

“DRNY is pleased that the absentee voting program is now more accessible. Through this agreement, the New York State Board of Elections has made it easier for people with print disabilities to vote with greater privacy and independence. All voters need access to vote in order to have their voice heard in their local, state and federal elections,” said Timothy A. Clune, Executive Director, Disability Rights New York.

“The National Federation of the Blind supports the right of blind, deafblind, and other disabled voters to mark their ballots privately and independently so that their right to a secret ballot is protected,” said Mike Robinson, President of the National Federation of the Blind of New York State. “We are pleased that the state of New York is taking these steps so that its blind and deafblind voters can exercise this constitutional right, which is fundamental to participation in our democracy.”

“This ruling affirms that the right to vote is something that all people regardless of disability status should be able to fully exercise as their civil duty. It will provide absentee voters access via an electronic platform for people with a myriad of disabilities, including but not limited to, visual, learning, and physical,” said Sharon McLennon-Wier, Ph.D., MSEd., CRC, LMHC, Executive Director of the Center for Independence of the Disabled, New York. “This is a great step forward in making sure that New Yorkers with disabilities are able to successfully cast their votes.”

Karen Blachowicz, President of the American Council of the Blind, New York, said, “We’re pleased that the state will provide consistent accessible absentee voting methods and supervision of every county board, so that every blind voter in New York can be confident of an accessible absentee vote.”

“We will continue to fight for the fundamental right of people with disabilities to vote privately and independently,” said Christina Brandt-Young, Supervising Attorney at Disability Rights Advocates.

About American Council of the Blind (ACB):

The American Council of the Blind is a national grassroots consumer organization representing Americans who are blind and visually impaired. With 70 affiliates, ACB strives to increase the independence, security, equality of opportunity, and to improve quality of life for all blind and visually impaired people. Learn more by visiting www.acb.org.

About Center for Independence of the Disabled, NY (CIDNY):

The Center for Independence of the Disabled, NY’s goal is to ensure full integration, independence, and equal opportunity for all people with disabilities by removing barriers to the social, economic, cultural, and civic life of the community. Learn more about our work at www.cidny.org.

About Disability Rights Advocates (DRA):

Disability Rights Advocates is a leading national nonprofit disability rights legal center. Its mission is to advance equal rights and opportunity for people with all types of disabilities nationwide. DRA has a long history of enforcing the rights of voters with disabilities, including their rights to accessible voting machines, polling places, and online voter registration. Visit www.dralegal.org.

About Disability Rights New York (DRNY):

DRNY is the designated independent non-profit Protection & Advocacy System empowered by Congress to investigate allegations of abuse and neglect and provide legal and non-legal advocacy services to people with disabilities in New York State. The Protection & Advocacy System was created by Congress as a direct result of the horrific conditions that were uncovered in the 1970s at New York’s Willowbrook State School. DRNY is supported at taxpayer expense by the U.S. Department of Health & Human Services, Administration for Community Living; Center for Mental Health Services, Substance Abuse & Mental Health Services Administration; U.S. Department of Education, Rehabilitation Services Administration; and the Social Security Administration. This press release does not represent the views, positions or policies of, or the endorsements by, any of these federal agencies. Visit www.drny.org.

About National Federation of the Blind (NFB):

The National Federation of the Blind, headquartered in Baltimore, defends the rights of blind people of all ages and provides information and support to families with blind children, older Americans who are losing vision, and more. Founded in 1940, the NFB is the transformative membership and advocacy organization of blind Americans with affiliates, chapters, and divisions in the fifty states, Washington DC, and Puerto Rico. We believe in the hopes and dreams of blind people and work together to transform them into reality. Learn more about our many programs and initiatives at https://www.nfb.org.

Contacts

  • Disability Rights Advocates: Christina Brandt-Young
    cbrandt-young@dralegal.org
    212-644-8644
  • American Council of the Blind: Clark Rachfal
    crachfal@acb.org
    202-467-5081
  • Center for Independence of the Disabled, New York: Jeff Peters jpeters@cidny.org
    646-442-4154
  • National Federation of the Blind: Chris Danielsen
    cdanielsen@nfb.org
    410-262-1281
  • Disability Rights New York: Katrin Haldeman
    katrin.haldeman@drny.org
    518-512-4929
  • Brown Goldstein & Levy: Eve Hill
    ehill@browngold.com
    202-802-0925

Original at https://dralegal.org/press/ny-absentee-voting-settlement/

Sony’s New Lip-Reading Technology Could Boost Accessibility – or Invade Privacy

Sony’s Visual Speech Enablement uses cameras and AI to read lips from far distances, no matter the noise. But are its accessibility benefits overshadowed by the potential for privacy violations?

By Steven Winkelman
Updated January 13, 2021

Facial-recognition software can identify faces in a crowd, but how about picking up conversations without the help of nearby microphones? Sony’s Visual Speech Enablement does just that, using camera sensors and AI for augmented lip reading in any environment.

Mark Hanson, Sony’s VP of Product Technology and Innovation, gave a limited overview of the technology during a CES keynote. It’s a new use case for Sony’s Intelligent Vision Image Sensor and uses AI to isolate a user’s lips and then translates their movements into words, independent of any background or foreground noise. In fact, it requires no microphone whatsoever. The distance between the sensor and user is almost inconsequential and it can work over many feet, simply by using a higher-resolution sensor, Hanson told us last week.

Sony initially plans to market the technology for a handful of use cases, such as factory automation, kiosks, and voice-enabled ATMs. Visual Speech Enablement is optimized for use on computers, though consumer-facing versions of the feature could roll out on mobile hardware in the future, according to Hanson.

When asked about the potential for assistive uses of this technology, such as improving auto-generated captions or reducing the need for a relay operator or automated speech-recognition intermediary (which requires a solid data connection and minimal background noise), Hanson said the software is not optimized for such use cases yet but could be in the future.

For all of Visual Speech Enablement’s potential for good, there’s also the possibility it could be misused. Hanson says the technology only captures lips, not faces, so no user-identifiable data is retained. What remains unaddressed is the possibility of combining Visual Speech Enablement with other technologies, many of which use cameras and could incorporate Sony’s AI-enhanced sensors. If Visual Speech Enablement were to sit alongside a facial-recognition camera, the data could be aggregated and undo Sony’s built-in privacy protections.

Few mediums remain truly private, of course. Websites track you via cookies; some ISPs and mobile carriers sell your data. Despite crackdowns in some cities and states, facial-recognition technology is already in use on streets and in stores. Time will tell where something like Visual Speech Enablement fits in.

Editors’ Note: This story has been updated to correctly reflect statements from Sony’s spokesperson.

Original at https://www.pcmag.com/news/sonys-new-lip-reading-technology-could-boost-accessibility-or-invade-privacy

Bad Braille Plagues Buildings Across U.S., CBS News Radio Investigation Finds

By Steve Dorsey
June 28, 2019

The federal government, corporations, cities and even medical facilities across the country are looking past the needs of blind Americans by failing to address problems with braille signage.

CBS News has uncovered complaints to the Justice Department’s Disability Rights section about missing or incorrect braille at a number of public facilities, including Albuquerque’s bus system, restaurants in Kansas and Pennsylvania, and hospital and medical buildings in Chicago, among other locations. The records, spanning two years, were obtained through a Freedom of Information Act request.

Forty-one-year-old Vencer Cotton, who’s been blind since birth, often encounters bad braille in Washington, D.C. Cotton says he once entered the wrong restroom because of it.

“I swing open the door, I dive in, and I get that screaming group of ladies in a haste to put me out,” Cotton said. “And that was simply because the sign … said ‘Men’ in the braille.”

Accompanying a CBS News journalist, Cotton found incorrect and missing braille at a branch of the D.C. Public Library, which had a notable lack of braille signage and no labeling of audio books, which are a common way of reading for the blind.

At the Franklin Delano Roosevelt Memorial, the braille was too oversized for blind visitors to read. When asked about this, the National Park Service told CBS News that the braille on the memorial was “part of the artist’s design of the memorial,” and was “not necessarily intended as accessibility elements” for the blind.

City Hall at the Wilson Building in the District of Columbia featured labels that were overly generic, labeling a set of stairs simply as “stairs,” for instance, rather than identifying their location in the building, such as “northwest stairs” or “stairs 1.” A bathroom sign for the men’s restroom, Cotton said, was just nonsense.

“Having a correctly brailled exit sign could mean the difference between life and death,” he said.

Disability rights attorney and author Lainey Feingold says federal and state accessibility requirements are often ignored.

“Sadly, compliance with federal and state laws and regulations often don’t happen,” she said. “There isn’t the attention to detail around accessibility as there is to other issues like security and privacy when really it is all the same.”

A spokesman for the U.S. Access Board, the U.S. government’s watchdog over accessibility design standards at federally funded buildings, says older facilities and monuments are often excluded from disability guidelines. And the board says it has no estimate of how many federally funded buildings are complying with disability access laws.

But almost 30 years since the Americans with Disabilities Act became law, many visually impaired people like Cotton say the availability of accurate braille is still falling short.

Jake Rosen contributed to this report

Original at https://www.cbsnews.com/news/bad-braille-plagues-buildings-across-u-s-cbs-news-radio-investigation-finds/

How Does the Refreshed Section 508 Rule Affect Your Agency?

By Mark Gross
Aug 21, 2017

Part of the Rehabilitation Act of 1973, and complementary to the Americans with Disabilities Act, Section 508 requires that government agencies provide individuals with disabilities equal access to their programs, services and activities. Specifically, Section 508 deals with electronic and information technology, including web page content, PDF documents and audio and video content, and it specifies requirements to ensure that all such content is accessible to people with disabilities. The latest update to Section 508, known as “the refresh,” went into effect March 21, 2017.

The number of people affected by Section 508 is significant. As much as 10 percent of the population may be affected by blindness, low vision, learning disabilities or other difficulties that impair their ability to access information. Section 508 mandates that federal agencies allow easier navigation, accessibility and readability through, for example, navigation aids embedded into documents that allow software to “read” materials in the right order, textual descriptions of images that the computer can “read” to people who cannot see them, and adjustments to fonts and colors that make materials easier to read.

While the act is specific to federal agencies, its effect is much broader, as local and state governments, educational organizations and even private corporations often incorporate the requirements by reference — some because of legal requirements, some because it’s a good business practice.

This article focuses on what changed in the refresh and how the refresh affects agency systems. The good news is that if agencies complied with the original Section 508 rule, then they are ahead of the game concerning the refreshed rule. Simply put, agencies that were compliant are still compliant, because there is a “safe harbor” clause embedded in the new rule that exempts existing or “legacy” IT from having to meet the refreshed rule. Keep in mind, though, that new or updated web pages created after the new rule went into effect must comply with the new rule by January of 2018.

While the discussion below enumerates a number of important changes that the refresh incorporates, the more important point is that the law helps set the tone for what the federal government sees as a baseline for accessibility policies, and it provides a renewed focus on the importance of making electronic content accessible to people with disabilities.

The following summarizes major changes as categorized in the Access Board guidelines:

Restructuring provisions by functionality instead of product type to keep better pace with the increasingly multifunctional capabilities of technology. The original standards used a product-by-product approach, mandating specific features to support common assistive technologies and specific use cases. For example, in 1998 mobile devices were rare and not addressed. The new standards take a functionality-based approach that will better keep pace with technological advances.

Incorporating the Web Content Accessibility Guidelines (WCAG) 2.0. The final rule incorporates by reference a number of internationally accepted voluntary consensus standards, including WCAG 2.0. Issued by the W3C’s Web Accessibility Initiative, WCAG 2.0 is a globally recognized, technology-neutral standard for web content. The final rule applies WCAG 2.0 not only to web-based information but to all electronic content. (For a concrete taste of one of WCAG 2.0’s testable criteria, see the sketch that follows this list of changes.)

The benefits of incorporating the WCAG 2.0 into the Section 508 standards are significant. A substantial amount of WCAG 2.0 support material is available, WCAG 2.0-compliant accessibility features are already built into many products, and it promotes international harmonization as it is referenced by the European Commission, Canada, Australia, New Zealand, Japan, Germany and France.

Specifying types of non-public facing electronic content that must comply. In addition to public-facing materials, the standard also specifies non-public electronic content used by a federal agency for official business to communicate other important information, such as emergency notifications, initial or final decisions adjudicating administrative claims or proceedings, internal or external program or policy announcements, notices of benefits, program eligibility, employment opportunities or personnel actions, formal acknowledgements or receipts, questionnaires or surveys, templates or forms, educational or training materials and web-based intranets.

Requiring that operating systems provide accessibility features and clarifying that software and operating systems must interoperate with assistive technology. The standard includes requirements for software that directs the use and operation of technology for applications and mobile apps, operating systems and processes that transform or operate on information and data. These provisions cover the interoperability with assistive technology, applications and authoring tools, such as screen-magnification software and refreshable Braille displays.

Addressing access for people with cognitive, language and learning disabilities. The standard also recognizes other disabilities, for example by making sure there are no flashing lights on the screen that could cause difficulties for some people and by making sure that distractions are reduced.

Harmonizing the requirements with international standards. In 2014, the European Commission adopted the “Accessibility requirements for public procurement of information and communications technology products and services in Europe” (EN 301 549). European governments can use the requirements as technical specifications or award criteria in public procurements of technology products and services. The Access Board has worked to ensure broad harmonization between its requirements and the European Commission’s standards.
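
Returning to the WCAG 2.0 point above: one of the guideline’s machine-testable requirements is the color-contrast minimum of success criterion 1.4.3, which asks for a contrast ratio of at least 4.5:1 between normal-size text and its background. The short sketch below, in Python, implements WCAG 2.0’s published relative-luminance and contrast-ratio formulas; it is an illustration only, not an official conformance checker, and the example colors are arbitrary.

# Illustration only: WCAG 2.0 relative-luminance and contrast-ratio formulas.
# Not an official conformance tool; the example colors are arbitrary.

def _linearize(channel_8bit):
    """Convert one 8-bit sRGB channel to its linear value per WCAG 2.0."""
    c = channel_8bit / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """Relative luminance of an (R, G, B) color: 0.0 for black, 1.0 for white."""
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(rgb1, rgb2):
    """WCAG contrast ratio, from 1:1 (identical colors) to 21:1 (black on white)."""
    lighter, darker = sorted(
        (relative_luminance(rgb1), relative_luminance(rgb2)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

if __name__ == "__main__":
    ratio = contrast_ratio((119, 119, 119), (255, 255, 255))  # mid-grey text on white
    print(f"{ratio:.2f}:1")   # about 4.48:1
    print(ratio >= 4.5)       # False: just misses the 4.5:1 minimum for normal text

Automated checks like this cover only some WCAG 2.0 criteria; many others, such as whether an image’s text description is actually meaningful, still require human judgment.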

While Section 508 is a mandate for materials produced by the federal government, the standard is applicable to any organization that wants to make materials more accessible. Being 508 compliant is simply good business. It will make it easier for all customers to find content, and for employees to work more easily. That’s a rule refresh maybe we should all think about.

To learn more about Section 508 and the refreshed rule, you can visit https://www.section508.gov.

About the Author

Mark Gross is president of Data Conversion Laboratory (DCL).

Original at https://gcn.com/articles/2017/08/21/section-508-refresh.aspx

Low-Cost Smart Glove Translates American Sign Language Alphabet

By Liezel Labios, UC San Diego
Wednesday, July 12, 2017

“The Language of Glove”: a smart glove that wirelessly translates the American Sign Language (ASL) alphabet into text and controls a virtual hand to mimic ASL gestures.

Engineers at the University of California San Diego have developed a smart glove that wirelessly translates the American Sign Language alphabet into text and controls a virtual hand to mimic sign language gestures. The device, which engineers call “The Language of Glove,” was built for less than $100 using stretchable and printable electronics that are inexpensive, commercially available and easy to assemble. The work was published on July 12 in the journal PLOS ONE.

In addition to decoding American Sign Language gestures, researchers are developing the glove to be used in a variety of other applications ranging from virtual and augmented reality to telesurgery, technical training and defense.

“Gesture recognition is just one demonstration of this glove’s capabilities,” said Timothy O’Connor, a nanoengineering Ph.D. student at UC San Diego and the first author of the study. “Our ultimate goal is to make this a smart glove that in the future will allow people to use their hands in virtual reality, which is much more intuitive than using a joystick and other existing controllers. This could be better for games and entertainment, but more importantly for virtual training procedures in medicine, for example, where it would be advantageous to actually simulate the use of one’s hands.”

The glove is unique in that it has sensors made from stretchable materials, is inexpensive and simple to manufacture. “We’ve innovated a low-cost and straightforward design for smart wearable devices using off-the-shelf components. Our work could enable other researchers to develop similar technologies without requiring costly materials or complex fabrication methods,” said Darren Lipomi, a nanoengineering professor who is a member of the Center for Wearable Sensors at UC San Diego and the study’s senior author.

The ‘language of glove’

The team built the device using a leather athletic glove and adhered nine stretchable sensors to the back at the knuckles: two on each finger and one on the thumb. The sensors are made of thin strips of a silicon-based polymer coated with a conductive carbon paint. The sensors are secured onto the glove with copper tape. Stainless steel thread connects each of the sensors to a low-power, custom-made printed circuit board that’s attached to the back of the wrist.

The sensors change their electrical resistance when stretched or bent. This allows them to code for different letters of the American Sign Language alphabet based on the positions of all nine knuckles. A straight or relaxed knuckle is encoded as “0” and a bent knuckle is encoded as “1”. When signing a particular letter, the glove creates a nine-digit binary key that translates into that letter. For example, the code for the letter “A” (thumb straight, all other fingers curled) is “011111111,” while the code for “B” (thumb bent, all other fingers straight) is “100000000.” Engineers equipped the glove with an accelerometer and pressure sensor to distinguish between letters like “I” and “J”, whose gestures are different but generate the same nine-digit code.

The low power printed circuit board on the glove converts the nine-digit key into a letter and then transmits the signals via Bluetooth to a smartphone or computer screen. The glove can wirelessly translate all 26 letters of the American Sign Language alphabet into text. Researchers also used the glove to control a virtual hand to sign letters in the American Sign Language alphabet.
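
As a rough sketch of that decoding step, the snippet below (Python) turns nine knuckle readings into a binary key and looks the key up in a table. It is illustrative only: the sensor-reading values, the bend threshold and every key other than “A” and “B” (the two examples given above) are assumptions rather than details taken from the UC San Diego paper, and the Bluetooth step is omitted.

# Illustrative sketch of the nine-sensor decoding scheme described above.
# Only the "A" and "B" keys come from the article; the threshold and
# calibration values are assumptions, and Bluetooth transmission is omitted.

KEY_TO_LETTER = {
    "011111111": "A",  # thumb straight, all other fingers curled
    "100000000": "B",  # thumb bent, all other fingers straight
    # ... the remaining letters of the ASL alphabet would be listed here
}

BEND_THRESHOLD = 1.5  # hypothetical resistance ratio; a real glove needs calibration

def knuckle_bits(resistances, baselines):
    """Build the nine-digit key: index 0 is the thumb, 1-8 the finger knuckles.
    A knuckle counts as bent ("1") when its resistance rises well above its
    relaxed baseline, since bending stretches the sensor and raises resistance."""
    return "".join(
        "1" if reading / baseline > BEND_THRESHOLD else "0"
        for reading, baseline in zip(resistances, baselines)
    )

def decode(resistances, baselines):
    """Return the ASL letter for a reading, or None for an unknown key
    (e.g. "I" vs. "J", which need the accelerometer to tell apart)."""
    return KEY_TO_LETTER.get(knuckle_bits(resistances, baselines))

if __name__ == "__main__":
    baselines = [1.0] * 9  # resistances measured with the hand relaxed
    reading = [1.0, 1.9, 2.1, 2.0, 1.8, 2.2, 1.9, 2.0, 2.1]  # thumb straight, fingers curled
    print(decode(reading, baselines))  # prints: A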

Moving forward, the team is developing the next version of this glove, one that’s endowed with the sense of touch. The goal is to make a glove that could control either a virtual or robotic hand and then send tactile sensations back to the user’s hand, Lipomi said. “This work is a step toward that direction.”

Paper title: “The Language of Glove: Wireless gesture decoder with low-power and stretchable hybrid electronics” by Timothy F. O’Connor, Mathew Fach, Rachel Miller, Samuel E. Root, Patrick P. Mercier and Darren J. Lipomi, all at UC San Diego.

This work was supported by the National Institutes of Health Director’s New Innovator Award (1DP2EB022358-01). An earlier prototype of the device was supported by the Air Force Office of Scientific Research Young Investigator Program (grant no. FA9550-13-1-0156). Additional support was provided by the Center for Wearable Sensors at the UC San Diego Jacobs School of Engineering and member companies Qualcomm, Sabic, Cubic, Dexcom, Honda, Samsung and Sony.

Contact

Liezel Labios

UC San Diego
(858) 246-1124 llabios@ucsd.edu

Original at https://www.universityofcalifornia.edu/news/low-cost-smart-glove-translates-american-sign-language-alphabet

BlindSquare and BlindWays, Connecting the Dots for Travelers in Boston, Then and Now.

By Ile, April 24, 2017

Discovery and recollection.

Discovery and recollection are two necessary elements for travel for our friends who are blind. This presents a difficult task in a city the size of Boston, with its nearly 8,000 bus stops.

THEN:

It is necessary for all of us to move from point A to point B, whether home to work, work to play or home to “the necessaries” such as groceries, doctors’ offices, visiting friends or journeys to study. Public transit is a wonderful and beneficial asset. The physical movement of vehicles, coupled with information to support choices for travel, is an important service to the community at large. Still, for a person who is blind, partially sighted or deafblind, the “last few feet” can be a great void.

BlindSquare supports its adventurers with great guidance, based on information available from many sources, but the fact is that “B” (as a destination) is often approximate, leading its adventurers “quite close” but precision is elusive. Perkins School for the Blind took on the challenge to fill this information void.

BlindWays

Let’s consider a bus stop as a “dot” on the map. The focus of BlindWays is to describe “what’s inside the dot,” yielding clear descriptions in consistent language and providing a trustworthy resource for transit riders.

Travel, for a person who is blind, requires a lot of planning, discovery and recollection. That’s a lot to ask of a traveler in Boston, with its nearly 8,000 bus stops, especially for an adventurer interested in exploring!

Perkins has created an iOS application called BlindWays that provides a means to collect the information from “thousands of discoveries,” from the contribution of hundreds of blind and sighted volunteers, into an application purposely crafted to provide the “micro-navigation” information necessary to lead the traveler closer and closer to the bus stop. The “discovery” is recorded using BlindWays and then is available to the entire community. The “recollection” is at the traveler’s fingertips, sensitive and responsive to their current location. Now, on approach, the adventurer can discover important information on the location of a trash container on the right, the location of a bus shelter on the left, the fact that the stop is “right across from the Fairmont Hotel” or other known points. All adding convenience and confidence for the traveler. All ensuring that when the bus arrives, the traveler is standing in the right spot, without the need to ask for assistance, and no longer “missing the bus.” Wonderful.

BlindSquare

BlindSquare is known for supporting travel globally, whether live “right now” or by simulating travel “to future destinations.” It provides yeoman service daily, “connecting the dots” and presenting valuable information and choices for its adventurers.

A vision was shared between the people behind BlindSquare and BlindWays: to create a perfect use case for blind travelers. We posed the question, “What would travel look like if we could include BlindWays’ detailed descriptions with BlindSquare?” The idea caught on quickly, and the pairing of the information was completed and announced at CSUN 2017 to great applause.

In Boston, and in many other communities, BlindSquare connects transit information and associates it with “the dot,” known as the bus stop. We automatically identify the bus stop “dot” and supply information such as the stop number, expected arrival times for upcoming buses and even service interruptions, information often not available to sighted travelers. This is now much better in Boston, today!

NOW.

In Boston, augmenting the transit system information, we now connect to BlindWays’ crowdsourced descriptions of Massachusetts Bay Transportation Authority (MBTA) bus stops, expanding the information about the “bus-stop-dot” with a simple gesture.

This information is freely available on request, accommodating the “frequent or first” traveler equally well. A frequent traveler is not interrupted with familiar information; a first-time traveler can readily unlock abundant information to support their journey.

Joann Becker, a Boston transit rider:

“I love the convenience of using one app which offers invaluable GPS information coupled with the micro navigation that BlindWays provides its users. I am able to look around with BlindSquare to choose a destination and then use BlindWays clues to help me get within a cane’s length of the bus stop! I feel empowered using these apps for independent travel around Boston!”

Can I try it myself? Of course!

  • What does this all sound like? An audio demonstration by Ilkka Pirttimaa, simulating a Boston bus stop can be found at the link below.
  • Try from home! One advantage that BlindSquare brings is the ability to simulate travel. From across the city, or around the globe, it’s easy to place yourself in a simulated location (such as Boston) and “look around”.
    Let’s simulate a location in Boston! Follow this link on your iPhone with BlindSquare installed and it will simulate a bus stop in Boston and prompt you to “shake your iPhone to hear BlindWays information”. Select “OK” then shake your phone and listen! You can advance to secondary information by shaking your phone again. You can, of course, pick a destination of your own! Have some fun as you adventure around Boston.
  • What if I don’t have BlindSquare? BlindSquare has provided a “free use” area covering all of Boston with BlindSquare Event, until July 1st. BlindSquare Event can be downloaded from the app store.

Original at http://blindsquare.com/2017/04/blindways/

How Do Deaf-Blind People Communicate?

American Association of the Deaf-Blind

Deaf-blind people have many different ways of communicating. The methods they use vary, depending on the causes of their combined vision and hearing loss, their backgrounds, and their education.

Below are some of the most common ways that deaf-blind people communicate. The methods described here are used primarily in the United States.

Sign Language and Modifications

Signed Languages:

Some deaf or hard of hearing people with low vision use American Sign Language or an English-based sign language. In some cases, people may need to sign or fingerspell more slowly than usual so the person with limited vision can see signs more clearly. Sometimes the person with low vision can see the signs better if the signer wears a shirt that contrasts with his or her skin color (e.g., a person with light skin needs to wear a dark-colored shirt).

Adapted Signs:

Some deaf-blind people with restricted peripheral vision may prefer the signer to sign in a very small space, usually at chest level. Some signs located at waist level may need to be adapted (e.g. signing “belt” at chest level rather than at waist level).

Tactile Sign Language:

The deaf-blind person puts his or her hands over the signer’s hands to feel the shape, movement and location of the signs. Some signs and facial expressions may need to be modified (for example, signing “not understand” instead of signing “understand” and shaking one’s head; spelling “dog” rather than signing “dog”). People can use one-handed or two-handed tactile sign language.

People who grew up using ASL in the deaf community may prefer tactile ASL, while others who came from an oral background or learned signs later may prefer a more English-based tactile system.

Tracking:

Some deaf-blind people with restricted but still usable vision (e.g., tunnel vision) may follow signs by holding the signer’s forearm or wrist and using their eyes to follow the signs visually. This helps them follow signs more easily.

Tactile Fingerspelling:

Usually, blind or visually impaired people who lose their hearing later, or deaf or hard of hearing people who have depended on speechreading and do not know how to sign, prefer tactile fingerspelling, because sign language can sometimes be difficult to learn.

The deaf-blind person may prefer to put his or her hand over the fingerspelling hand, or on the signer’s palm, or cup his or her hand around the signer’s hand.

Speechreading

Tadoma:

This is a way for deaf-blind people with little or no usable vision to speech-read another person by touch. They put their thumb on the other person’s chin, and their fingers on the other person’s cheek to feel the vibrations of the person’s voice and the movement of their lips. This method is rarely used nowadays.

Other deaf or hard of hearing people with usable vision use speechreading as well as their residual vision and hearing. They may use hearing aids, cochlear implants and/or assistive listening devices to help them hear and understand other people better.

Face-to-Face Communication Systems

Screen Braille Communicator:

Some deaf-blind people use a Screen Braille Communicator (SBC). This is a small, portable device that enables them to communicate with sighted people. The device has a QWERTY keyboard with an LCD display on one side, and an eight-cell braille display on the other side. The sighted person types short text on the QWERTY keyboard. The deaf-blind person reads the typed text by placing his or her fingers on the braille display. He or she then uses the braille display to type back text. The sighted person can read the reply on the LCD display.
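
As a generic illustration of the text-to-braille step such a device performs, the short sketch below (Python) maps characters to Unicode braille cells and pages them across an eight-cell window. It is not the SBC’s actual software, and the letter table is deliberately partial.

# Illustration only: render typed text as braille cells and page it across
# an eight-cell window, matching the width of the SBC's braille display.
# Uses Unicode braille patterns (U+2800 block); the letter table is partial.

LETTER_DOTS = {
    "a": (1,), "b": (1, 2), "c": (1, 4), "d": (1, 4, 5), "e": (1, 5),
    "h": (1, 2, 5), "i": (2, 4), "l": (1, 2, 3), "o": (1, 3, 5),
}

def to_braille_cell(ch):
    """Map one character to a Unicode braille pattern (blank cell if unknown)."""
    mask = 0
    for dot in LETTER_DOTS.get(ch.lower(), ()):
        mask |= 1 << (dot - 1)  # bit n-1 of U+2800..U+28FF raises dot n
    return chr(0x2800 + mask)

def display_windows(text, cells=8):
    """Split a message into successive windows of `cells` braille characters."""
    braille = "".join(to_braille_cell(c) for c in text)
    return [braille[i:i + cells] for i in range(0, len(braille), cells)]

if __name__ == "__main__":
    for window in display_windows("hello hello"):
        print(window)  # the 11 cells print as one window of 8 and one of 3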

TTY with Braille Display:

The TTY is connected with and stacked on top of a braille display, although both can be separate. It allows a deaf-blind person who reads braille to use the telephone. The deaf-blind person can also use this system as a face-to-face communication device to communicate with someone else who does not know the person’s preferred communication method.

Also, some people who don’t see well can use TTYs with large visual displays or computers with larger font to communicate with others.

Captel:

Some people with hearing and vision loss use CapTel to make telephone calls. Using a special phone, the CapTel USB, people can dial into a captioning service that types the other caller’s conversation onto a computer screen. Then, deaf-blind callers can read a conversation script on their screens in addition to listening to the other caller on their telephones. The captions can be adjusted for color, size or font style on the screen.

Braille Notetakers

Deaf-blind people can also use braille notetakers to communicate with others who don’t know braille or their communication system. Many braille notetakers can be connected with personal digital assistants (PDAs) that are commonly used by others.

Alternate Communication

Print on Palm (POP):

The person communicating with the deaf-blind person prints large block letters on the other person’s palm. Each letter is written in the same location on the person’s palm. This is frequently a way for deaf-blind people to communicate with the public.

These are only a few of the many ways that deaf-blind people can communicate with each other and with others. For more specific information, contact the AADB Office.

Original at http://www.aadb.org/factsheets/db_communications.html

Legislation Will Improve Access to Copyrighted Materials for Visually Impaired and Print-Disabled Canadians

OTTAWA, Ontario, June 23, 2016 (Marketwired)
Innovation, Science and Economic Development Canada

Canadians who are visually impaired or print disabled will have better access to books and other copyrighted materials. The Honourable Navdeep Bains, Minister of Innovation, Science and Economic Development, today announced that the Act to Amend the Copyright Act (access to copyrighted works or other subject-matter for persons with perceptual disabilities) has received royal assent.

The amendments to the Copyright Act enable Canada to be among the first countries in the world to accede to the Marrakesh Treaty to Facilitate Access to Published Works for Persons Who Are Blind, Visually Impaired or Otherwise Print Disabled.

By bringing the country’s copyright law in line with the Treaty, Canada has shown leadership in ensuring a wider availability of books and other materials for Canadians with visual impairments and print disabilities.

Quotes

“I am proud that this legislation is coming into force. Improving access to books and other copyrighted materials for the visually impaired and print disabled is a priority for our government. I am pleased that, with the passage of this legislation, Canada is leading by example in making the world a more accessible place.”

– The Honourable Navdeep Bains, Minister of Innovation, Science and Economic Development

“I am honoured that our government is standing up for Canadians with disabilities. With the Act coming into force, Canadians with print disabilities will have more equitable access to alternative-format published materials and will benefit from greater accessibility and opportunities in their communities and workplaces.”

– The Honourable Carla Qualtrough, Minister of Sport and Persons with Disabilities

“I am proud that the government has taken action to ensure that visually impaired and print-disabled persons have access to the latest and best published materials from around the world. This legislation will allow Canadians to participate fully and actively in our society, and it will contribute to the development of our inclusive economy.”

– The Honourable Mélanie Joly, Minister of Canadian Heritage

“CNIB is thrilled Bill C-11, which implements the Marrakesh Treaty, has received royal assent. This is an important milestone for Canadians with vision loss and other print disabilities. Access to literature is a human right and the Marrakesh Treaty will provide greater access to published literature in alternate formats, unlocking opportunities for education and employment and providing equal access to all Canadians.”

– Diane Bergeron, Executive Director, Strategic Relations and Engagement, CNIB

Quick facts

  • The Marrakesh Treaty, an international treaty administered by the World Intellectual Property Organization, was adopted in Marrakesh in 2013.
  • It establishes standardized exemptions to copyright laws, allowing people to reproduce copyright-protected works in accessible formats and to import or export them.
  • Once 20 countries have joined, the Marrakesh Treaty will come into effect. As of June 23, 2016, 17 countries have ratified or acceded to the Treaty.

Follow Minister Bains on social media.

Twitter: @MinisterISED

Contact Information

Philip Proulx
Press Secretary
Office of the Minister of Innovation,
Science and Economic Development
343-291-2500

Media Relations

Innovation, Science and Economic Development
343-291-1777
ic.mediarelations-mediasrelations.ic@canada.ca

Reproduced from http://www.marketwired.com/press-release/legislation-will-improve-access-copyrighted-materials-visually-impaired-print-disabled-2137150.htm

Technology is Failing to Meet the Needs of Older People With Hearing and Sight Problems, Report Finds

10 March 2016

  • Assistive technology developers need to do more to meet the diverse needs of deafblind people
  • By 2030, the UK is likely to have 570,000 people with hearing and sight problems

Assistive technology developers and service providers need to do more to meet the diverse needs of the rising number of older people with both hearing and sight problems, according to a new report launched at the University of Sheffield today (10 March 2016).

The study, Keeping in Touch with Technology, was commissioned in 2014 by Sense, the national charity for deafblind people, to explore the experiences of older people using telecare and assistive technology.

Growing numbers of people with sight and hearing problems are living in the community and seven in ten of those are aged over 70. By 2030, the UK is likely to have 570,000 people with hearing and sight problems, including 418,000 people over 70 and 245,000 people with severe impairments.

To find out more about their experiences, researchers made repeat visits over several months to 38 older people using telecare or other technology to help with their hearing and vision.

They found:

  • When it suits the abilities of an older person with both sight and hearing difficulties, technology can have many benefits and help them in their everyday lives
  • However, service providers, assistive technology suppliers and product developers need to do more to meet the diverse needs of the rising number of older people with both hearing and sight problems
  • Some people in the study had negative attitudes towards technology, but these were not the main thing stopping them from effectively using technology
  • Limited knowledge and low awareness of available equipment and technology, and a lack of information about how to obtain it, were common problems
  • Few items of equipment seemed to have been designed for those with both sight and hearing problems.

Professor Sue Yeandle, Professor of Sociology and Director of the Centre for International Research on Care, Labour and Equalities (CIRCLE) at the University of Sheffield, led the study with the Universities of Leeds and Oxford.

The report will be launched during an event attended by former Home Secretary the Rt Hon Lord Blunkett.

Professor Yeandle said: “Huge strides have been made in technologies which can help people with sight, hearing and other difficulties communicate with others and live well and independently. But too few of the growing number who could benefit get this equipment and many of them lack items designed with their needs and lifestyles in mind, or don’t get the follow-up support needed to use them.

“We are working with Sense to ensure our findings are communicated to everyone involved in developing, supplying and planning support using technology. Our conference today will focus on exploring practical ways of giving older people with dual sensory impairment a stronger voice, and ensuring their needs and aspirations are met.”

In the study, most of the people with dual sensory impairment (DSI) used equipment to summon help or to alert them to events, such as a pendant alarm or flashing beacons linked to a smoke alarm. About a third had technology to help them hear, aside from hearing aids, and two thirds had devices to help them see. Some also had specialist ICT or telecommunications equipment, and away from home, accessible GPS devices helped some to travel and access activities.

People with sight and hearing problems are entitled to a specialist assessment, but some participants reported difficulty in obtaining one, and others said the assessment focused almost exclusively on risk and safety rather than on what was important to them. Most telecare and ‘alerting technology’ had been supplied by local authorities. A few people had been referred by the NHS to social services, and some had been supplied with hearing aids, magnifiers and talking blood glucose monitors through the NHS. Charitable organisations had also provided some participants with equipment, support and advice relating to technology, and most were happy with this service.

However, some people were sceptical or concerned about using telecare or assistive technology, either because of past, negative experiences with equipment or a fear it might mark them out as vulnerable. Some felt they were too old to learn, while others were enthusiastic and said technology helped them manage everyday chores, journeys and routines.

The study found alerting technologies, like pendant alarms, improved people’s relationships by reducing concern about risk, particularly if their family or friends were concerned about their safety.

However, participants noted some barriers to using assistive technology, including:

  • Not knowing what is available or where to get it from
  • The cost and choice, with many feeling confused or worried about different prices and the many products available for private purchase
  • A limited choice of products being available from their local authority
  • The available equipment failing to fit their needs and feeling they were forced to ‘compromise’
  • Lacking the necessary guidance on using equipment when it was first supplied, if any difficulties arose, or when their circumstances changed.

Additional information

The University of Sheffield

With almost 26,000 of the brightest students from around 120 countries, learning alongside over 1,200 of the best academics from across the globe, the University of Sheffield is one of the world’s leading universities.

A member of the UK’s prestigious Russell Group of leading research-led institutions, Sheffield offers world-class teaching and research excellence across a wide range of disciplines.

Unified by the power of discovery and understanding, staff and students at the university are committed to finding new ways to transform the world we live in.

In 2014 it was voted number one university in the UK for Student Satisfaction by Times Higher Education and in the last decade has won four Queen’s Anniversary Prizes in recognition of the outstanding contribution to the United Kingdom’s intellectual, economic, cultural and social life.

Sheffield has five Nobel Prize winners among former staff and students and its alumni go on to hold positions of great responsibility and influence all over the world, making significant contributions in their chosen fields.

Global research partners and clients include Boeing, Rolls-Royce, Unilever, AstraZeneca, Glaxo SmithKline, Siemens and Airbus, as well as many UK and overseas government agencies and charitable foundations.

For further information, please visit http://www.sheffield.ac.uk

Sense

Sense is a national charity that has supported and campaigned for children and adults who are deafblind for over 60 years. There are currently around 250,000 deafblind people in the UK. Sense provides specialist information, advice and services to deafblind people, their families, carers and the professionals who work with them. Further information can be found on Sense’s website www.sense.org.uk

Contact

For further information please contact:

Hannah Postles
Media Relations Officer
University of Sheffield
0114 222 1046
h.postles@sheffield.ac.uk

Reproduced from http://www.sheffield.ac.uk/news/nr/sense-report-deafblind-technology-1.557568