More than 67% of IT employees say cyberbullying incidents at their schools have increased as students spend more time on laptops, Chromebooks, and other devices. Nearly 80% of those IT employees said cyberbullying behaviors sometimes happen on a device provided to students by the school.
Of course, bullying—and even cyberbullying—wasn’t created when schools switched to some form of distance learning in response to the COVID-19 pandemic. But as schools shift from in-person classrooms (and congested bathrooms and hallways), bullies are adapting, too.
But new tactics don’t mean school leaders are going to ignore the problem. They want to be alerted to incidents and respond to protect their students. And not just from cyberbullying: another study found that 45% of children have seen pornography before age 13, and 17% of high school students have seriously considered suicide.
Your school is charged with taking care of the whole student, not just shoving information into their heads. So, you likely want to know when (not if) your students are engaging in inappropriate behavior, showing signs of depressive or anxiety disorders, or looking at a site with inappropriate content.
But how can your teachers and administrators know what’s going on in a student’s head—especially in a remote learning situation? How can you supervise collaborative and online communications? How can you protect students from inappropriate or illicit online content at home or school?
With an increasingly encrypted web, it’s trickier to get the information you need to protect students. Getting search term reports, creating selective access to Google services, and tapping into YouTube controls all require SSL decryption. But more than 60% of school IT personnel say they aren’t decrypting SSL at all. Completely blocking YouTube and social media is always an option, but those platforms can often be used for learning in today’s (in-person and remote) classrooms. That’s why 40% of schools allow YouTube access for everyone.
Even when schools do filter their devices and networks (and not all do), that filtering usually has limitations. Typical filters block only about half of the pornography on the web. They often generate too much noise, and too many false alerts, when trying to analyze websites in real time. And schools often have to rely on separate filtering solutions for on-campus and off-campus devices, which adds complexity to the process.
Why paint this gloomy picture? Because we know how you can get help to erase that picture and replace it with one where students are safe and you have the knowledge to help them stay that way.
Planning for educational continuity in a time of remote and hybrid learning means creating a seamless learning experience regardless of where students are learning. Part of ensuring that continuity is providing the content filtering and device management tools necessary for student safety online.
Lenovo is committed to helping schools through this digital transformation that is required for educational continuity. That’s why we’re partnering with Lightspeed to provide schools worldwide with an ecosystem of cloud-based solutions with reliable filtering, analytics, and device management tools. This includes advanced artificial intelligence (AI) to monitor, interpret, and flag warning signs in emails, Google Docs, social media posts, web searches, Microsoft Teams meetings and chats—really, nearly everywhere students are interacting online.
Artificial intelligence made real to combat inappropriate content
Your IT team can use Lightspeed’s AI to automatically block millions of inappropriate, harmful, and unknown sites, images, and videos. You can scale student safety with cloud-based, device-level protections that work across all devices, operating systems, and learning environments.
Lightspeed Filter is the only content-filtering software for schools that incorporates each of the four recognized components for maximum student online safety:
- A comprehensive, vetted content database
- Vigilant, always-on web crawling
- Segment-leading machine learning
- An exclusive team of in-house data scientists to analyze, interpret, and adapt the solution to new developments and threats
Lightspeed Filter’s SmartPlay feature uses patented agents and a mature database to effectively block millions of inappropriate videos, thumbnails, and “recommended by” YouTube content so that YouTube can be a safe teaching tool. Customizable policy controls enable admins to set parameters and selectively permit content without being overly restrictive.
The information to head off violent incidents
Lightspeed Alert uses sophisticated AI crawlers, data science, and statistical methodologies to locate, contextually interpret, and surface signs of student violence, cyberbullying, and self-harm, allowing your school to intervene before incidents occur.
Lightspeed Alert draws on the company’s 20-plus years of experience in online student protection, deploying an enormous proprietary database of threats and online student behavior patterns to home in on actual red flags for self-harm or the potential for violence. Lightspeed Alert has proven to be 30% more accurate than competing solutions.
Here’s the critical question
Lightspeed’s ecosystem of cloud-based solutions is rooted in expertise and innovation so your school can better educate and safeguard students.
Among schools that use Lightspeed Analytics, nearly 80% have identified students looking at pornography; more than 40% have identified cheating and violence threats; 40% have identified cyberbullying; and almost 40% have identified talk of suicide.
What would you discover—and prevent—with Lightspeed on your Lenovo devices? Now may be the time to find out. Read more about our best-in-class solution suite and our customer experiences in our Solutions for Education Catalog.
Contact us at email@example.com for more information.