# Online, Website, And Software Platform Experience In The Review Corpus

This report looks at the digital front door in the review corpus: websites, online forms, apps, and named systems such as `PATCHS`, `AskMyGP`, `eConsult`, `Accurx`, and the `NHS App`.

The main difficulty is still the same one: patients often do not know or use the software brand name. They say "the website", "the online form", "the app", or "the system". So the right way to read this is still broad first, named tools second.

This pass uses the rebuilt review index, strips practice-response text where possible, and stays focused on what patients themselves wrote.
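The response-stripping step above can be sketched as follows. This is a minimal illustration, not the actual pipeline: the `Response from` marker is a hypothetical example of the fixed prefix many review sites put before a practice's reply, and the real index may use richer rules.

```python
import re

# Hypothetical marker: many review sites prefix the practice's reply with a
# fixed phrase. "Response from" is an assumed example, not the real rule.
RESPONSE_MARKER = re.compile(r"\bResponse from\b.*", re.DOTALL)

def strip_practice_response(review_text: str) -> str:
    """Drop everything from the practice-reply marker onward,
    keeping only what the patient wrote."""
    return RESPONSE_MARKER.sub("", review_text).strip()
```

Anything after the marker (including line breaks, thanks to `re.DOTALL`) is discarded, so sentiment and keyword counts reflect patient wording only.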

## Headline

The digital layer is now a real part of the patient experience, not a side issue.

In the rebuilt `40,506`-review corpus:

- `2,586` reviews, `6.4%` of all reviews, mention a website, online route, app, named platform, or a recognisable digital-platform issue
- `1,803` mention the generic online or website layer
- only `382` explicitly name one of the main platforms or apps in this pass (`PATCHS`, `AskMyGP`, `eConsult`, `Accurx`, `NHS App`)

That still means most of the digital experience is not being described by product name. Patients mostly talk about a web route they have to use, not the vendor behind it.
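The broad-first, named-second reading can be sketched as a two-tier tagger. The keyword lists here are illustrative samples only; the rebuilt index's actual rules are not shown in this report.

```python
# Illustrative keyword lists; the real index likely uses richer matching.
GENERIC_TERMS = ("website", "online form", "online booking", "the app",
                 "online triage", "the system")
NAMED_PLATFORMS = ("patchs", "askmygp", "econsult", "accurx", "nhs app")

def tag_review(text: str) -> set[str]:
    """Return the digital buckets a single review text falls into."""
    lowered = text.lower()
    tags = set()
    if any(term in lowered for term in GENERIC_TERMS):
        tags.add("generic_digital")
    for name in NAMED_PLATFORMS:
        if name in lowered:
            tags.add(name)
    return tags

def digital_share(reviews: list[str]) -> float:
    """Share of reviews with any digital mention, as a percentage."""
    hits = sum(1 for r in reviews if tag_review(r))
    return round(100 * hits / len(reviews), 1)
```

A review can land in both tiers at once, which is why the generic and named counts above overlap rather than summing to the headline figure.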

## Generic Coverage First: "Online", "Website", "The Form"

The generic layer is still much bigger than any named platform.

The broad `online / website / online form / online booking / online triage` bucket now contains:

- `1,803` reviews total
- `903` negative reviews
- `847` positive reviews
- `53` mixed reviews

So the digital front door is not uniformly bad. It has a real positive side when it works. But negative experiences still outnumber positive ones, if only narrowly (`903` to `847`).

Practices with especially visible digital-review volumes include:

| Practice | Digital reviews | Share of all reviews | Negative | Positive |
| --- | ---: | ---: | ---: | ---: |
| `The Brooke Surgery` | `118` | `9.4%` | `14` | `103` |
| `Ashton Medical Group` | `110` | `11.2%` | `49` | `49` |
| `Millgate Healthcare Partnership` | `106` | `10.1%` | `27` | `77` |
| `Chorlton Family Practice` | `93` | `10.4%` | `20` | `63` |
| `Cheadle Medical Practice` | `53` | `11.0%` | `15` | `35` |
| `The Sides Medical Practice` | `37` | `13.0%` | `3` | `34` |
| `Ashville Surgery` | `36` | `14.0%` | `5` | `29` |
| `LADYBARN GROUP PRACTICE` | `35` | `9.5%` | `6` | `28` |

This is still the main digital pattern in the corpus: the same kind of online route can be described as quick, modern, and convenient at one practice, and as yet another barrier at the next.
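The "Share of all reviews" column in the table above is a straightforward per-practice ratio. A hedged sketch, assuming the data is reduced to `(practice, is_digital)` pairs, which is an assumption about the index rather than its real shape:

```python
from collections import Counter

def practice_digital_shares(rows: list[tuple[str, bool]]) -> dict[str, float]:
    """Map each practice to its digital-mention share as a percentage.

    `rows` is an assumed simplification: one (practice name, has a
    digital mention) pair per review.
    """
    totals, digital = Counter(), Counter()
    for practice, is_digital in rows:
        totals[practice] += 1
        if is_digital:
            digital[practice] += 1
    return {p: round(100 * digital[p] / totals[p], 1) for p in totals}
```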

## Named Platforms: Visible, But Still Under-Named

### `PATCHS`

Explicit `PATCHS` mentions are still present but not huge:

- `58` reviews total
- `21` negative
- `32` positive
- `5` mixed

The balance is genuinely mixed.

Good `PATCHS` reviews usually talk about speed and efficiency:

> "Having used the PATCHS system several times now, I would like to say how quick, easy and efficient the system has been."  
> `Norden Branch Surgery`, `2 months ago`

Bad `PATCHS` reviews usually talk about visibility, access, or exclusion:

> "No record of submission using PATCHs."  
> `Chorlton Family Practice`, `6 months ago`

> "Receptionist rude insisting father has to use patchs when he’s no good on a smartphone or computer."  
> `Family Surgery`, `3 months ago`

So `PATCHS` still reads like a high-variance system. Where the workflow behind it works, patients praise it. Where the surrounding setup is weak, it becomes another locked gate.

### `AskMyGP`

`AskMyGP` remains the most visible named patient platform:

- `140` reviews total
- `57` negative
- `80` positive
- `3` mixed

That is still split, but now leans more positive than negative (`80` to `57`).

Good `AskMyGP` reviews usually say:

- same-day response
- quick advice
- no need to fight the phones
- easy to use when backed by real follow-through

Example:

> "do ask my gp on line, always get a response on same day"  
> `The Brooke Surgery`, `Edited 2 years ago`

Bad `AskMyGP` reviews still say:

- it is always closed
- it fills too quickly
- requests get closed off rather than acted on
- phone messages point patients to a route that is not really open

Examples:

> "They do not respond to emails, and AskMyGP is always closed."  
> `Tower Family Healthcare`, `2 years ago`

> "My requests on AskmyGP just get closed off with a link to a pharmacy."  
> `Ashville Surgery`, `a month ago`

So `AskMyGP` still looks highly practice-dependent. It can feel fast and effective when the practice is staffed to use it properly. It feels much worse where it becomes a shut door or a dead end.

### `eConsult`

`eConsult` is still lightly named directly:

- `36` reviews total
- `17` negative
- `19` positive

That low count should not be read as low use. It probably still means many patients experience it simply as "the online form".

### `Accurx`

`Accurx` is still barely named as a patient-facing brand:

- `6` reviews total
- `2` negative
- `3` positive
- `1` mixed

That is not absence. It is obscurity. Patients often meet the route through the practice website rather than through the product name.

### `NHS App`

The `NHS App` remains one of the more visible named systems:

- `142` reviews total
- `60` negative
- `74` positive
- `8` mixed

This is another genuinely split platform.

Positive reviews talk about:

- easy contact
- easy booking
- quick responses
- records and prescription convenience

Example:

> "I have always found contacting the practice using the NHS app easy and quick with a fantastic response from the staff."  
> `The Sides Medical Practice`, `9 months ago`

Negative reviews talk about:

- messaging being disabled
- no appointments available
- being pushed into another route from the app
- confusion about whether the app is actually meant to work

Example:

> "It says on their website that you can contact them via the NHS app but ... messaging [is] disabled"  
> `Heaton Moor Medical Group`, `a year ago`

The `NHS App` often reads less like a full front door than like a relay point between systems.

## The Main Digital Issue Types

### 1. Speed and convenience when the system works

This is the strongest positive theme by far:

- `408` reviews in the speed/convenience bucket
- `316` positive
- `85` negative

This is the best case for digital routes.

Patients praise them when they deliver:

- same-day appointments
- same-day callbacks
- quick responses
- a route that works without the `8am` phone fight

Examples:

> "Using online form for appointment easy and obtained same day appointment"  
> `Chorlton Family Practice`, `8 months ago`

> "I have always found contacting the practice using the NHS app easy and quick"  
> `The Sides Medical Practice`, `9 months ago`

The positive digital model in the corpus is still very clear: the request goes in, a human responds quickly, and the patient gets seen.

### 2. Usability and instruction failure

This is one of the clearest negative themes:

- `112` reviews total
- `72` negative
- `31` positive
- `9` mixed

Patients often do not complain about the idea of digital care. They complain that the route is badly explained, badly linked, or awkward to navigate.

Common wording includes:

- no instructions
- hard to find
- not obvious where to click
- difficult to use
- confusing setup

The recurring complaint here is not mainly clinical. It is design failure.

### 3. No reply, lost submission, or silent failure

This remains one of the worst themes by feel:

- `54` reviews
- `50` negative

These are the reviews where the patient does the right digital thing and then nothing happens.

Examples:

> "No record of submission using PATCHs."  
> `Chorlton Family Practice`, `6 months ago`

Patients describe:

- no record of submission
- no response after sending
- no callback
- requests disappearing into the system

This is a key trust problem. A bad website is one thing. A form that appears to work and then silently fails is worse.

### 4. Closed forms, narrow windows, and digital queueing

This is still a smaller but sharp theme:

- `24` reviews
- `22` negative

The digital route often reproduces the old `8am` rush instead of replacing it.

Patients describe:

- forms only open for a short window
- systems already full by `8am`
- `AskMyGP` or `PATCHS` being closed
- online routes acting like just another queue

So one of the clearest failures is not that software exists, but that scarcity has been turned into a software timer.

### 5. Triage burden and self-diagnosis pressure

This bucket is still fairly small but distinctive:

- `16` reviews
- `11` negative
- `3` positive
- `2` mixed

The wording here is sharp:

- too many questions
- feeling left to self-diagnose
- triage software seeming inappropriate for urgent need

This is where digital triage tips from inconvenience into distrust.

### 6. Digital exclusion

This is still a smaller explicit bucket:

- `14` reviews
- `12` negative

But it is probably understated, because many exclusion complaints are written indirectly rather than with neat keywords.

Examples are still about:

- older patients
- disability
- not being able to use computers or smartphones
- being forced into a route they cannot realistically use

The `PATCHS` example about an elderly patient being pushed onto a smartphone route is exactly the kind of complaint that makes this visible.
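The six issue types above can be sketched as a phrase-based classifier. The phrase lists here are illustrative samples drawn from the wording quoted in this section, not the full rule set behind the bucket counts.

```python
# Illustrative phrase lists only; the real buckets likely use broader rules.
THEME_PHRASES = {
    "speed_convenience": ["same day", "quick", "easy", "efficient"],
    "usability_failure": ["no instructions", "hard to find", "confusing",
                          "difficult to use"],
    "silent_failure": ["no record of submission", "no response", "no callback"],
    "closed_windows": ["always closed", "already full", "only open"],
    "triage_burden": ["too many questions", "self-diagnose"],
    "digital_exclusion": ["no good on a smartphone", "can't use computers"],
}

def classify_themes(text: str) -> list[str]:
    """Return every issue-type bucket whose phrases appear in the review."""
    lowered = text.lower()
    return [theme for theme, phrases in THEME_PHRASES.items()
            if any(p in lowered for p in phrases)]
```

A review can fall into several buckets at once, which is one reason the per-theme counts should not be summed against the headline digital total.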

## What Patients Infer Even When They Never Name The Platform

Most patients still do not say "this is `Accurx`" or "this is `eConsult`". They say:

- the website
- the online form
- the app
- the system
- the link

That means the real comparison in the corpus is still not mainly vendor versus vendor. It is between kinds of patient experience.

### Good unnamed digital experience

- request sent quickly
- same-day callback
- same-day appointment
- easier than ringing at `8am`
- clear enough to use without help

### Bad unnamed digital experience

- no instructions
- forced onto a website after the phone fails
- online route only open in a narrow window
- no response after submitting
- confusing handoff between app, practice site, and platform
- feeling forced to self-triage

So the practical patient comparison is still not `PATCHS` versus `AskMyGP` in a tidy vendor sense. It is whether the digital route behaves like a quick bridge to care, or like another wall.

## Practices Where The Digital Layer Looks Better

The more clearly positive digital clusters in the rebuilt corpus include:

- `The Brooke Surgery`
- `The Sides Medical Practice`
- `The Range Medical Centre`
- `Ashville Surgery`
- `LADYBARN GROUP PRACTICE`
- parts of `Chorlton Family Practice`

What they have in common in patient reviews:

- lots of same-day or fast-response stories
- online forms described as easy or efficient
- digital routes backed by real human follow-through
- patients sometimes explicitly compare the practice favourably with worse surgeries they have used before

The point is still not the platform alone. It is whether the workflow behind it appears to work.

## Practices Where The Digital Layer Looks Worse

The more negative digital clusters now include:

- `Florence House Medical Practice`
- `The Robert Darbishire Practice`
- `Delamere Medical Practice`
- parts of `Ashton Medical Group`
- practices where the app, website, and phone routes seem to push patients back into each other

What these negative clusters have in common:

- more negative than positive digital mentions
- complaints about the website or form replacing human contact
- confusion about where to go, what link to use, or which system is active
- closed, unavailable, or unresponsive digital routes
- digital systems being experienced as another barrier rather than an easier route in

## Bottom Line

The online/web layer in this corpus is real, widespread, and mixed.

The strongest single finding is still that patients usually do not name the software. They talk about a generic digital front door. That front door now shows up in `2,586` reviews and is described in two very different ways.

When it works, patients love it:

- same-day appointments
- quick callbacks
- no need to fight the phone queue

When it fails, they do not talk like software users. They talk like blocked patients:

- the website does not work
- the form is closed
- there is no response
- they are forced into a route they cannot use
- one system sends them to another
- nobody is available except the software

So the real divide here is still not vendor versus vendor. It is whether the digital route is actually connected to care, or whether it is just a new way of being shut out.
