diff --git a/app/assets/stylesheets/_variables.scss b/app/assets/stylesheets/_variables.scss index 51afa04b5..d412b3ec7 100644 --- a/app/assets/stylesheets/_variables.scss +++ b/app/assets/stylesheets/_variables.scss @@ -18,6 +18,7 @@ $danger: #EB5959; $success: #2ECC71; $info: #58A09A; $brand: #4B68FF; +$brand-comp: #F05137; $data: ( diff --git a/app/assets/stylesheets/complaints.scss b/app/assets/stylesheets/complaints.scss index fdec7e561..8c679c338 100644 --- a/app/assets/stylesheets/complaints.scss +++ b/app/assets/stylesheets/complaints.scss @@ -60,3 +60,44 @@ .is-lead + h1.complaint-title { margin-top: -1rem; } + +.is-brand-comp { + color: $brand-comp; +} + +.with-subtitle { + margin-bottom: 0; +} + +.subtitle { + margin-top: 0; + font-weight: normal; + font-size: 1.2rem; +} + +.modules { + display: grid; + grid-template-columns: repeat(auto-fit, minmax(20rem, 1fr)); + gap: 0.5rem; +} + +.module-widget { + border: 1px solid $muted-graphic; + border-radius: 0.2rem; + padding: 0.5rem; + color: $key !important; + + > h4 { + text-decoration: underline; + } + + > p { + text-decoration: none; + } + + &:hover { + background: $primary; + color: white !important; + text-decoration: none !important; + } +} diff --git a/app/controllers/complaints_controller.rb b/app/controllers/complaints_controller.rb index 3d8af4507..953cd9353 100644 --- a/app/controllers/complaints_controller.rb +++ b/app/controllers/complaints_controller.rb @@ -3,6 +3,7 @@ class ComplaintsController < ApplicationController before_action :access_check, only: [:show, :comment] before_action :write_access_check, only: [:self_assign, :update_status, :change_content_type] before_action :verify_staff, only: [:reports, :reporting] + before_action :training_access, only: [:training] def index render layout: 'without_sidebar' @@ -202,6 +203,16 @@ def reporting render layout: 'without_sidebar' end + def training + pages = Dir.glob(Rails.root.join('app', 'views', 'complaints', 'training', '*.html.erb')) + 
.map { |page| File.basename(page, '.html.erb') } + if pages.include?(params[:page]) + render "complaints/training/#{params[:page]}", layout: 'osa_training' + else + not_found! + end + end + private def access_check @@ -235,4 +246,10 @@ def set_complaint @complaint end + + def training_access + unless user_signed_in? && (current_user.staff? || current_user.at_least_moderator?) + not_found! + end + end end diff --git a/app/views/complaints/report.html.erb b/app/views/complaints/report.html.erb index 730ddedba..f608f10eb 100644 --- a/app/views/complaints/report.html.erb +++ b/app/views/complaints/report.html.erb @@ -6,7 +6,8 @@
Thank you for taking the time to make a report. If you've seen harmful, abusive, or illegal content on our communities, you can report this to us here. You can also use this page if you've received a message saying we've - classified your content as harmful, abusive, or illegal and you wish to contest it. + classified your content as harmful, abusive, or illegal and you wish to contest it, or if you have a complaint + about our processes or our compliance with our duties.
Last updated 12 March 2026
+ +
+ It is important to understand exactly what each type of content is, so each type is defined here. These are our own
+ definitions: the Act defines each type of content in terms of related criminal offences, which is more complex than
+ is necessary for everyday moderation, so these descriptions are intended to provide a simple overview.
+
+ ++ Our risk assessment has also identified some additional types of illegal content based on the platform's risk profile. +
+ ++ As part of our responsibilities under the Online Safety Act, we're obligated to provide training to all staff and + volunteers undertaking moderation duties. +
+
+ Take your training here. Working through all the modules will record that you have completed the training, but you
+ can revisit these pages at any time if you need guidance when making moderation decisions.
+
+ ++ An overview of the Online Safety Act, our duties, and your responsibilities as a volunteer moderator. +
+ <% end %> + <%= link_to osa_training_path('illegal-content'), class: 'module-widget' do %> ++ An explanation of the difference between the 17 types of priority illegal content, and other applicable types of + non-priority illegal content. +
+ <% end %> + <%= link_to osa_training_path('definitions'), class: 'module-widget' do %> ++ Definitions of all the types of illegal content which apply to us. +
+ <% end %> + <%= link_to osa_training_path('handling'), class: 'module-widget' do %> ++ Your responsibilities and the steps you need to take in response to identifying potentially illegal content. +
+ <% end %> + <%= link_to osa_training_path('higher-risk'), class: 'module-widget' do %> ++ Some types of content are more likely to occur in our communities than others. More detail on those here. +
+ <% end %>
+ <%= link_to osa_training_path('conclusion'), class: 'module-widget' do %>
+
+ Thank you for taking the time to complete this training. Mark it as complete here, and return to these pages
+ whenever you need to refer back to them.
+
+ <% end %> +Last updated 12 March 2026
+ ++ The Act sets out 17 types of priority illegal content, and a number of types of non-priority illegal content. We have + carried out a risk assessment for all types of priority illegal content, and applicable types of non-priority illegal + content, which details the likelihood and impact of each type of content on our platform specifically. +
+ + + +Last updated 12 March 2026
+ +
+ The Online Safety Act 2023 (available here) is a UK law
+ passed in 2023 with the aim of improving online safety, particularly with regard to children, but with wide-ranging
+ effects for all online services. All services with UK users are required to comply with the Act. There are ongoing
+ cases which will determine whether this is enforceable against non-UK entities in practice, but because Codidact is
+ a UK-based entity, we are clearly within scope and required to comply.
+
++ The Act is enforced by the UK's communications regulator, Ofcom, which also sets out the Register of Risks and Codes + of Practice on which our approach is based. +
+ ++ The Act defines two types of service: search services and user-to-user services. User-to-user services are those where + users may interact with one another; this is where we fall. There are different requirements imposed on each kind of + service, which for user-to-user services primarily focus on preventing and removing harmful content, and protecting + users from related harms. +
+ +
+ Responsibility for compliance with the requirements of the Act falls on us (meaning the Codidact Foundation
+ as the organisation running the platform). The Foundation designates one member of the Board of Directors as the
+ named individual with ultimate responsibility for compliance with the Act; this is currently
+ ArtOfCode.
+
++ One of our responsibilities is to ensure that our volunteer moderators (that's you) have an awareness of the Act and + are provided with appropriate training in order to equip them to handle any harmful content which may appear on the + platform. +
+ ++ As a volunteer moderator, your job is to guide, curate, and set the tone for your community. Part of that job is + protecting the community from any unwanted content. The majority of the time, that might take the form of off-topic + posts, arguments between users, or handling flags for your attention. Unfortunately, it may also take the form of + harmful or illegal content covered by the Act, and one of your responsibilities is to ensure this is dealt with and + escalated appropriately. +
++ To be clear: we're not expecting you to handle harmful or illegal content alone. Our ask of you is + simple: if you identify something that you think would be covered by the Act, please: +
+
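A note for reviewers: the `training` action's page lookup doubles as a whitelist, because `params[:page]` is only ever compared against the basenames of templates that actually exist on disk. A standalone sketch of that behaviour (the directory and method name here are illustrative, not part of the diff):

```ruby
require 'fileutils'
require 'tmpdir'

# Sketch of the page-whitelisting approach used in ComplaintsController#training.
# Building the list of renderable pages from the template files that exist means
# a crafted params[:page] such as "../../config/secrets" can never reach render.
def allowed_training_pages(dir)
  Dir.glob(File.join(dir, '*.html.erb'))
     .map { |page| File.basename(page, '.html.erb') }
end

Dir.mktmpdir do |dir|
  # Simulate two training templates.
  FileUtils.touch(File.join(dir, 'overview.html.erb'))
  FileUtils.touch(File.join(dir, 'definitions.html.erb'))

  pages = allowed_training_pages(dir)
  puts pages.sort.inspect                 # => ["definitions", "overview"]
  puts pages.include?('../../etc/passwd') # => false
end
```

Deriving the whitelist from the filesystem, rather than interpolating user input straight into `render`, is what makes the `not_found!` branch safe for any unexpected `params[:page]` value.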
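This diff does not include the `config/routes.rb` change that defines the `osa_training_path` helper used by the module widgets, so a route along these lines is assumed; the path, helper name, and placement are guesses, not part of the diff:

```ruby
# Hypothetical route -- assumed, not shown in this diff. A dynamic :page
# segment plus `as: :osa_training` would yield the osa_training_path('overview')
# helper used in the view; the controller then whitelists :page itself.
get 'complaints/training/:page', to: 'complaints#training', as: :osa_training
```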