
European Journal of Neurology

2022 - Volume 29
Issue 4 | April 2022

ORIGINAL ARTICLE

Background and purpose

Superficial siderosis of the central nervous system is a sporadic finding in magnetic resonance imaging, resulting from recurrent bleedings into the subarachnoid space. This study aimed to determine the frequency of spinal dural cerebrospinal fluid (CSF) leaks amongst patients with a symmetric infratentorial siderosis pattern.

Methods

In all, 97,733 magnetic resonance images performed between 2007 and 2018 in our neurocenter were screened by a keyword search for “hemosiderosis” and “superficial siderosis.” Siderosis patterns on brain imaging were classified according to a previously published algorithm. Potential causative intracranial bleeding events were also assessed. Patients with a symmetric infratentorial siderosis pattern but without causative intracranial bleeding events in history were prospectively evaluated for spinal pathologies.

Results

Forty-two patients with isolated supratentorial siderosis, 30 with symmetric infratentorial siderosis and 21 with limited (non-symmetric) infratentorial siderosis were identified. Amyloid angiopathy and subarachnoid hemorrhage were causes of isolated supratentorial siderosis. In all four patients with a symmetric infratentorial siderosis pattern but without a causative intracranial bleeding event in their history, spinal dural abnormalities were detected. Dural leaks were also searched for in patients with symmetric infratentorial siderosis and a history of an intracranial bleeding event without known bleeding etiology, considering that spinal dural CSF leaks themselves may also cause intracranial hemorrhage, for example by inducing venous thrombosis due to low CSF pressure. In this way, one additional spinal dural leak was detected.

Conclusions

Persisting spinal dural CSF leaks can frequently be identified in patients with a symmetric infratentorial siderosis pattern. Diagnostic workup in these cases should include magnetic resonance imaging of the whole spine.

ORIGINAL ARTICLE

Background and purpose

This study aimed to evaluate the effect of menopause on disability accumulation in women followed from their clinically isolated syndrome (CIS).

Methods

We examined the longitudinal changes in Expanded Disability Status Scale (EDSS) scores from CIS until the last follow‐up in women belonging to the Barcelona CIS prospective cohort, followed through their menopausal transition. The analysis is based on 13,718 EDSS measurements, with an average of 28 EDSS measurements per patient. Differences in EDSS trajectories between menopausal and nonmenopausal women, controlling for age and disease duration, were evaluated. We performed two sensitivity analyses in women with confirmed MS and in those experiencing early menopause.
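For readers who want to see how a slope change at menopause can be encoded in such a model, a minimal illustrative sketch follows; the variable names, the simulated data, and the random-intercept-only structure are assumptions for illustration, not the cohort's actual model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated toy data: yearly EDSS-like scores for 120 hypothetical patients,
# each reaching menopause at a different time after CIS (all values invented).
rng = np.random.default_rng(1)
rows = []
for pid in range(120):
    base = 1.0 + rng.normal(0, 0.5)              # patient-specific intercept
    meno_at = rng.uniform(5, 20)                 # years from CIS to menopause
    for t in np.arange(0, 25, 1.0):
        post = max(t - meno_at, 0.0)             # years spent post-menopause
        score = base + 0.03 * t + 0.00 * post + rng.normal(0, 0.3)
        rows.append({"pid": pid, "years": t, "post_meno": post, "score": score})
df = pd.DataFrame(rows)

# Random intercept per patient; the 'post_meno' coefficient estimates the change
# in annual slope after menopause, i.e. the "inflection point" tested in the study.
model = smf.mixedlm("score ~ years + post_meno", data=df, groups=df["pid"])
print(model.fit().summary())
```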

Results

Of 764 eligible women, 496 (65%) responded to the questionnaire, and 74 (14.9%) reached menopause over the follow-up. We did not find a significant inflection point in EDSS trajectories around menopause (slope change −0.009; 95% CI −0.066 to 0.046). The annual increase in EDSS over the complete course of the disease was significantly higher in menopausal women (0.049; 95% CI 0.026–0.074) than in nonmenopausal women (0.019; 95% CI 0.008–0.031; interaction p = 0.025). This difference was lost when controlling for age and disease duration (EDSS annual increase 0.059; 95% CI 0.025–0.094 vs. 0.038; 95% CI 0.021–0.057, respectively; interaction p = 0.321). No inflection point was detected when the analysis was restricted to women with confirmed MS or with earlier menopause.

Conclusions

Menopause is not associated with an increased risk of disability in a CIS population, considering EDSS trajectories throughout the course of the disease together with age and disease duration.

SHORT COMMUNICATION

Background and purpose

Requiring a walking aid is a fundamental milestone in multiple sclerosis (MS), represented by an Expanded Disability Status Scale (EDSS) score ≥6.0. In the present study, we assess the effect of ocrelizumab (OCR) on time to EDSS score ≥6.0 in relapsing MS.

Methods

Time to EDSS score ≥6.0 confirmed for ≥24 and ≥48 weeks was assessed over the course of 6.5 years (336 weeks) in the double‐blind period (DBP) and open‐label extension (OLE) period of the OPERA I (NCT01247324) and OPERA II (NCT01412333) studies.

Results

Time to reach EDSS score ≥6.0 was significantly delayed in those initially randomized to OCR versus interferon. Over 6.5 years, the risk of requiring a walking aid confirmed for ≥24 weeks was 34% lower among those who initiated OCR earlier versus delayed treatment (average hazard ratio [HR] DBP+OLE 0.66, 95% confidence interval [CI] 0.45–0.95; p = 0.024); the risk of requiring a walking aid confirmed for ≥48 weeks was 46% lower (average HR DBP+OLE 0.54, 95% CI 0.35–0.83; p = 0.004).

Conclusion

The reduced risk of requiring a walking aid in earlier initiators of OCR demonstrates the long‐term implications of earlier highly effective treatment.

ORIGINAL ARTICLE

Background and purpose

Reaching Expanded Disability Status Scale (EDSS) ≥7.0 represents the requirement for a wheelchair. Here we (i) assess the effect of ocrelizumab on time to EDSS ≥7.0 over the ORATORIO (NCT01194570) double‐blind and extended controlled periods (DBP+ECP), (ii) quantify likely long‐term benefits by extrapolating results, and (iii) assess the plausibility of extrapolations using an independent real‐world cohort (MSBase registry; ACTRN12605000455662).

Methods

Post hoc analyses assessed time to 24-week confirmed EDSS ≥7.0 in two cohorts of patients with primary progressive multiple sclerosis (baseline EDSS 3.0–6.5) from ORATORIO and MSBase.

Results

In the ORATORIO DBP+ECP, ocrelizumab reduced the risk of 24-week confirmed EDSS ≥7.0 (hazard ratio = 0.54, 95% confidence interval [CI]: 0.31–0.92; p = 0.022). Extrapolated median time to 24-week confirmed EDSS ≥7.0 was 12.1 and 19.2 years for placebo and ocrelizumab, respectively (7.1-year delay [95% CI: −4.3 to 18.4]). In MSBase, the median time to 24-week confirmed EDSS ≥7.0 was 12.4 years.

Conclusions

Compared with placebo, ocrelizumab significantly delayed time to 24‐week confirmed wheelchair requirement in ORATORIO. The plausibility of the extrapolated median time to reach this milestone in the placebo group was supported by observed real‐world data from MSBase. Extrapolated benefits for ocrelizumab over placebo could represent a truly meaningful delay in loss of ambulation and independence.

Issue Information

Issue Information

ORIGINAL ARTICLE

Background and purpose

Regional cerebral blood flow (rCBF) and oxygen metabolism (rCMRO2) in whole brain, white matter, gray matter and lenticular nuclei were studied in people living with human immunodeficiency virus (PLHIV), including those with HIV-associated neurocognitive disorder (HAND).

Methods

Treatment-naïve PLHIV underwent neurocognitive assessment and magnetic resonance (MR) measurement of rCBF and rCMRO2, with repeat measurement after 12 months of antiretroviral therapy (ART). Age- and sex-matched controls underwent single MR measurements. Regional CBF and rCMRO2 were compared amongst symptomatic HAND, asymptomatic HAND, cognitively normal PLHIV and controls using analysis of variance. HAND worsening (≥1 category) after 12 months of ART was assessed and correlated with rCBF and rCMRO2 measured by MR imaging using the paired-sample t-test.

Results

Thirty PLHIV completed baseline and 12-month assessments (29 with rCMRO2 measurement). At baseline HAND assessment, 13% had no cognitive impairment, 27% had asymptomatic neurocognitive impairment, 60% had mild neurocognitive disorder and none had HIV-associated dementia. At 12 months, 13% had no cognitive impairment, 20% had asymptomatic neurocognitive impairment, 50% had mild neurocognitive disorder and 17% had HIV-associated dementia. In those without HAND worsening (n = 21) rCMRO2 remained stable, whereas in those with HAND worsening (n = 8) rCMRO2 declined from baseline to 12 months in white matter (2.05 ± 0.40 to 1.73 ± 0.51, p = 0.03) and lenticular nuclei (4.32 ± 0.39 to 4.00 ± 0.51, p = 0.05).

Conclusions

In recently diagnosed PLHIV, no association was found between rCBF or rCMRO2 and cognitive impairment at baseline. There was a reduction in rCMRO2 in those with worsening cognitive function at 12 months on ART. Reduction in rCMRO2 may be a biomarker of cognitive decline in PLHIV.

ORIGINAL ARTICLE

Background and purpose

A rapid response to preventive therapy is of pivotal importance in severely disabled patients with chronic migraine (CM) and diverse preventive treatment failures. This prospective, observational, multicenter real‐life study aimed at investigating the effectiveness of galcanezumab in the first 3 months of treatment of CM patients at 14 Italian headache centers.

Methods

All consecutive adult patients with a CM diagnosis and a clinical indication for galcanezumab were considered. We collected patients' baseline characteristics, monthly headache days, monthly painkiller intake, migraine clinical characteristics, and disability scale scores during a 1-month run-in period (baseline) and the first 3 months of therapy. Possible predictors of treatment response were considered.

Results

A total of 156 patients (82.4% female, aged 47.3 ± 12.3 years) were enrolled. The 65 (41.7%) patients with a consecutive ≥50% response rate (RR) in the 3 months of therapy presented a lower body mass index (p = 0.004) and more frequently presented unilateral migraine pain (p = 0.002) and good response to triptans (p = 0.003). Persistent conversion from CM to episodic migraine was observed in 55.8% (87/156) of patients. They more frequently presented a good response to triptans (p = 0.003) and unilateral pain (p = 0.046). At baseline, 131 of 156 (83.9%) patients presented medication overuse (MO). Of these, 61.8% (81/131) no longer displayed MO consistently during the 3 months. These patients were more frequently responders to triptans (p = 0.002) and less frequently suffered from gastrointestinal comorbidity (p = 0.007).

Conclusions

Unilateral pain, good response to triptans, and normal weight may be associated with a persistent positive response in the first 3 months of therapy with galcanezumab in CM patients.

ORIGINAL ARTICLE

Background and purpose

This study was undertaken to investigate the effect of genetic risk on whole brain white matter (WM) integrity in patients with Parkinson disease (PD).

Methods

Data were acquired from the Parkinson's Progression Markers Initiative (PPMI) database. Polygenic load was estimated by calculating weighted polygenic risk scores (PRS) using (i) all 26 available PD-risk single nucleotide polymorphisms (SNPs) (PRS1) and (ii) the 23 SNPs with minor allele frequency (MAF) > 0.05 (PRS2). According to the PRS2, and combined with clinical and diffusion tensor imaging (DTI) data over 3-year follow-up, 60 PD patients were screened and assigned to a low-PRS group (n = 30) and a high-PRS group (n = 30) to investigate intergroup differences in clinical profiles and WM microstructure measured by DTI, both cross-sectionally and longitudinally.
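As an aside for readers unfamiliar with how a weighted polygenic risk score is assembled, the sketch below shows the basic computation; the effect sizes, dosages, and normalization by SNP count are illustrative assumptions, not the PPMI-derived weights used in the study.

```python
import numpy as np

# Hypothetical per-allele effect sizes (e.g., log odds ratios from a PD GWAS)
# and risk-allele dosages (0, 1 or 2 copies) for four placeholder SNPs.
effect_sizes = np.array([0.12, 0.25, 0.08, 0.31])
dosages = np.array([
    [2, 1, 0, 1],   # participant A
    [1, 0, 2, 2],   # participant B
])

# Weighted PRS: sum of (effect size x dosage) per participant, here divided by
# the number of SNPs so scores built from different panels stay comparable.
prs = dosages @ effect_sizes / effect_sizes.size
print(prs)  # higher values indicate a higher polygenic load
```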

Results

PRS were associated with younger age at onset in patients with PD (PRS1: Spearman ρ = −0.190, p = 0.003; PRS2: Spearman ρ = −0.189, p = 0.003). The high-PRS group showed more extensive WM microstructural degeneration than the low-PRS group, mainly involving the anterior thalamic radiation (AThR) and inferior fronto-occipital fasciculus (IFOF) (p < 0.05). Furthermore, WM microstructural changes in the AThR correlated with declining cognitive function (r = −0.401, p = 0.028) and increasing dopaminergic deficits in the caudate (r = −0.405, p = 0.030).

Conclusions

These findings suggest that PD-associated polygenic load aggravates WM microstructural degeneration and that these changes may lead to poorer cognition as dopamine depletion continues. This study provides evidence that a cumulative PRS combined with DTI measures may help predict disease progression in PD patients.

ORIGINAL ARTICLE

Background and purpose

Idiopathic inflammatory myopathy (IIM) can present with dysphagia as a leading or only symptom. In such cases, diagnostic evaluation may be difficult, especially if serological and electromyographical findings are unsuspicious. In this observational study we propose and evaluate a diagnostic algorithm to identify IIM as a cause of unexplained dysphagia.

Methods

Over a period of 4 years, patients with unexplained dysphagia were offered diagnostic evaluation according to a specific algorithm: The pattern of dysphagia was characterized by instrumental assessment (swallowing endoscopy, videofluoroscopy, high‐resolution manometry). Patients with an IIM‐compatible dysphagia pattern were subjected to further IIM‐focused diagnostic procedures, including whole‐body muscle magnetic resonance imaging, electromyography, creatine kinase blood level, IIM antibody panel and, as a final diagnostic step, muscle biopsy. Muscle biopsies were taken from affected muscles. In cases where no other muscles showed abnormalities, the cricopharyngeal muscle was targeted.

Results

Seventy‐two patients presented with IIM‐compatible dysphagia as a leading or only symptom. As a result of the specific diagnostic approach, 19 of these patients were diagnosed with IIM according to the European League Against Rheumatism (EULAR) criteria. Eighteen patients received immunomodulatory therapy as a result of the diagnosis. Of 10 patients with follow‐up swallowing examination, dysphagia improved in three patients after therapy, while it remained at least stable in six patients.

Conclusions

Idiopathic inflammatory myopathy constitutes a potentially treatable etiology in patients with unexplained dysphagia. The diagnostic algorithm presented in this study helps to identify patients with an IIM‐compatible dysphagia pattern and to assign those patients for further IIM‐focused diagnostic and therapeutic procedures.

ORIGINAL ARTICLE

Background and purpose

The faster rates of cognitive decline and predominance of atypical forms in early‐onset Alzheimer's disease (EOAD) suggest that neuropsychiatric symptoms could be different in EOAD compared to late‐onset AD (LOAD); however, prior studies based on non‐biomarker‐diagnosed cohorts show discordant results. Our goal was to determine the profile of neuropsychiatric symptoms in EOAD and LOAD, in a cohort with biomarker/postmortem‐confirmed diagnoses. Additionally, the contribution of co‐pathologies was explored.

Methods

In all, 219 participants (135 EOAD, 84 LOAD) meeting National Institute on Aging and Alzheimer's Association criteria for AD (115 amyloid positron emission tomography/cerebrospinal fluid biomarkers, 104 postmortem diagnosis) at the University of California San Francisco were evaluated. The Neuropsychiatric Inventory—Questionnaire (NPI‐Q) was assessed at baseline and during follow‐up. The NPI‐Q mean comparisons and regression models adjusted by cognitive (Mini‐Mental State Examination) and functional status (Clinical Dementia Rating Sum of Boxes) were performed to determine the effect of EOAD/LOAD and amnestic/non‐amnestic diagnosis on NPI‐Q. Regression models assessing the effect of co‐pathologies on NPI‐Q were performed.

Results

At baseline, the NPI-Q scores were higher in EOAD compared to LOAD (p < 0.05). Longitudinally, regression models showed a significant effect of diagnosis, whereby EOAD had higher NPI-Q total, anxiety, motor disturbance and night-time behavior scores (p < 0.05). No differences between amnestic and non-amnestic presentations were found. Argyrophilic grain disease co-pathology predicted higher NPI-Q scores in LOAD.

Conclusions

Anxiety, night‐time behaviors and motor disturbances are more severe in EOAD than LOAD across the disease course. The differential patterns of neuropsychiatric symptoms observed between EOAD/LOAD could suggest a pattern of selective vulnerability extending to the brain's subcortical structures. Further, co‐pathologies such as argyrophilic grain disease in LOAD may also play a role in increasing neuropsychiatric symptoms.

ORIGINAL ARTICLE

Background and purpose

Guillain–Barré syndrome (GBS) may be fatal in the acute phase but also affect long‐term prognosis due to irreversible sequelae and secondary medical complications. We determined the short‐term, intermediate, and long‐term mortality of GBS compared to the general population.

Methods

Individual-level data from nationwide registries were linked in this matched cohort study of all first-time hospital-diagnosed GBS patients in Denmark between 1987 and 2016 and 10 individuals from the general population per patient, matched on age, sex, and index date. We used Cox regression analysis to calculate matched mortality hazard ratios (HRs) following GBS, assessing short-term (0–6 months), intermediate (>6 months–4 years), and long-term (>4 years) mortality.
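To illustrate the matched Cox analysis described above, here is a minimal sketch that stratifies a proportional hazards model on the matched set, so that each GBS patient is compared with their own matched controls; the simulated data, column names, and use of the lifelines package are assumptions for illustration, not the registry analysis itself.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Simulate 300 matched sets: 1 hypothetical GBS patient plus 10 matched controls,
# with exposed individuals given a higher hazard (all numbers are invented).
rng = np.random.default_rng(0)
rows = []
for set_id in range(300):
    for gbs in [1] + [0] * 10:
        time = rng.exponential(scale=8.0 / (6.0 if gbs else 1.0))
        rows.append({"match_id": set_id, "gbs": gbs,
                     "years": min(time, 10.0), "died": int(time < 10.0)})
df = pd.DataFrame(rows)

# Stratifying on match_id gives every matched set its own baseline hazard, so the
# hazard ratio for 'gbs' is estimated within age-, sex- and index-date-matched sets.
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="died", strata=["match_id"])
cph.print_summary()
```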

Results

We identified 2414 patients with GBS and 23,909 matched individuals from the general population. Short‐term mortality was 4.8% (95% confidence interval [CI] = 4.0–5.8) and 0.8% (95% CI = 0.7–0.9) for GBS patients and general population members, respectively, resulting in an HR of 6.6 (95% CI = 4.0–5.8). Intermediate mortality was 7.6% (95% CI = 6.5–8.9), compared with 5.8% (95% CI = 5.5–6.1) for general population members, corresponding to an HR of 1.5 (95% CI = 1.3–1.8). After the first 4 years, long‐term mortality showed similar results for GBS patients and general population members (HR = 1.1, 95% CI = 0.9–1.2).

Conclusions

During the first 6 months after GBS hospital admission, GBS was associated with a 6.6‐fold increased mortality as compared with the background population of the same age. Mortality remained increased for approximately 4 years following GBS, and then leveled off to a similar long‐term mortality rate.

ORIGINAL ARTICLE

Background and purpose

Stroke‐related restless legs syndrome (sRLS) secondary to ischemic lesions is an emerging entity and an interesting condition, but there are limited available data to help us further understand its underlying pathways. In this study, we characterized sRLS clinically, neuroanatomically and functionally.

Methods

Consecutive patients hospitalized in the Stroke Unit of the University Hospital of Strasbourg were assessed clinically and electrophysiologically for sRLS characteristics. They underwent brain magnetic resonance imaging for the neuroanatomical study of involved structures, and received functional evaluations with 18F-FDG (2-deoxy-2-[18F]fluoro-D-glucose) positron emission tomography (PET) for glucose consumption, 123I-FP-CIT ([123I]-2beta-carbomethoxy-3beta-[4-iodophenyl]-N-[3-fluoropropyl]nortropane) single-photon emission computed tomography for dopamine reuptake, and PET with 18F-FDOPA (3,4-dihydroxy-6-[18F]fluoro-L-phenylalanine) for presynaptic dopaminergic synthesis.

Results

Sixteen patients with sRLS, eight women and eight men, aged 41–81 years, were included. The clinical characteristics of sRLS and idiopathic RLS were similar. Most patients presented with bilateral and symmetric RLS. Eight patients had infarction in the lenticulostriate area (middle cerebral artery and internal carotid artery). The body of the caudate nucleus was most commonly affected. Seven patients had sRLS secondary to ventral brainstem infarction (perforating branches of the basilar artery), affecting the pons in six patients and the medulla oblongata in one patient. Both the corticospinal tract and the cortico-pontocerebellar fibres were lesioned in all patients with brainstem stroke. One patient had infarction in the left posterior cerebellar vermis and occipital area (posterior cerebral artery and superior cerebellar artery). Isotopic explorations showed a significantly increased dopaminergic tone in the striatum ipsilateral to lenticulostriate infarction. Dopamine fixation was normal in patients with stroke outside of the lenticulostriate area.

Conclusions

Clinicians should be aware of the characteristics of sRLS for the appropriate diagnosis and treatment of this condition.

REVIEW ARTICLE

Background and purpose

Lewy body dementia (LBD), including dementia with Lewy bodies (DLB) and Parkinson's disease dementia, is a common form of neurodegenerative dementia. The frequency and influence of comorbid cerebrovascular disease is not understood but has potentially important clinical management implications.

Methods

A systematic literature search was conducted (MEDLINE and Embase) for studies including participants with DLB and/or Parkinson's disease dementia assessing cerebrovascular lesions (imaging and pathological studies). They included white matter changes, cerebral amyloid angiopathy, cerebral microbleeds (CMB), macroscopic infarcts, microinfarcts and intracerebral haemorrhage.

Results

Of 4411 articles, 63 studies were included. Cerebrovascular lesions commonly studied included white matter changes (41 studies) and CMB (18 studies). There was an increased severity of white matter changes on magnetic resonance imaging (visualized as white matter hyperintensities), but not neuropathology, in LBD compared to Parkinson's disease without dementia and age‐matched controls. CMB prevalence in DLB was highly variable but broadly similar to Alzheimer's disease (0%–48%), with a lobar predominance. No relationship was found between large cortical or small subcortical infarcts or intracerebral haemorrhage and the presence of LBD.

Conclusion

The underlying mechanisms of white matter hyperintensities in LBD require further exploration, as their increased severity in LBD was not supported by neuropathological examination of white matter. CMB in LBD had a similar prevalence to Alzheimer's disease. There is a need for larger studies assessing the influence of cerebrovascular lesions on clinical symptoms, disease progression and outcomes.

ORIGINAL ARTICLE

Background and purpose

Guillain–Barré syndrome (GBS) is an acute inflammatory autoimmune and demyelinating disease of the peripheral nervous system. Currently, valid biomarkers are unavailable for the diagnosis of GBS.

Methods

A comparative proteomics analysis was performed on the cerebrospinal fluid (CSF) from 10 patients with GBS and 10 patients with noninflammatory neurological disease (NND) using the tandem mass tags technique. The differentially expressed proteins were analyzed by bioinformatics, and then the candidate proteins were validated by the enzyme‐linked immunosorbent assay method in another cohort containing 160 samples (paired CSF and plasma of 40 patients with GBS, CSF of 40 NND patients and plasma of 40 healthy individuals).

Results

In all, 298 proteins were successfully identified in the CSF samples, of which 97 were differentially expressed between the GBS and NND groups. Three key molecules were identified as candidates for further validation. The CSF levels of TGOLN2 and NCAM1 were decreased in GBS patients compared with NND patients, whereas the CSF levels of APOC3 were increased. The enzyme-linked immunosorbent assay results were consistent with our proteomics analysis. Interestingly, in the validation cohort, serum APOC3 levels in the GBS group were consistent with those in the CSF samples and significantly higher than those in the healthy control group.

Conclusions

Our preliminary data suggest that the CSF protein expression profile of patients with GBS is different from that of patients with NND. Moreover, alterations of TGOLN2, NCAM1 and APOC3 may serve as novel biomarkers for identifying patients with GBS.

ORIGINAL ARTICLE

Background and purpose

The aim was to compare the effectiveness and safety of intravenous immunoglobulin (IVIg) or intravenous methylprednisolone (IVMP) versus IVIg plus IVMP (IPI) as initial therapy in anti-N-methyl-D-aspartate receptor (NMDAR) encephalitis.

Methods

This was a multicenter study of prospectively identified individuals with NMDAR encephalitis who presented from October 2011 to August 2020 to the study hospitals of western China, with a median follow-up of 3.9 years. Prespecified candidate variables were the prescriptions of IVIg, IVMP or IPI. Propensity score matching was also performed to control for potential confounders.

Results

A total of 347 NMDAR encephalitis patients were included in the final analysis. After TriMatch for NMDAR encephalitis, 37 triplets were generated. Compared to IVIg or IVMP, the administration of IPI showed a significant benefit, with a higher response rate (86.5% vs. 55.6% vs. 68.7%, p < 0.01), improved modified Rankin Scale scores at 3, 6 and 12 months (p < 0.05), and a reduced rate of further recurrence (10 of 37 [27.0%] vs. 9 of 37 [24.3%] vs. 2 of 37 [5.4%]; log-rank p = 0.01). There was no association between treatment superiority and patient sex or the presence of tumors (p ≥ 0.05). Patients treated with IVMP had a significantly higher number of adverse events, but 99% of adverse events were mild to moderate and did not lead to a change in treatment.

Conclusion

In patients with NMDAR encephalitis, adequate response, favorable outcome and less recurrence were each more likely to occur in individuals treated with a combined immunotherapy than in monotherapy individuals.

SHORT COMMUNICATION

Background and purpose

Cognitive decline is a recognized manifestation of long COVID, even among patients who experience mild disease. However, there is no evidence regarding the length of cognitive decline in these patients. This study aimed to assess whether COVID‐19‐related cognitive decline is a permanent deficit or if it improves over time.

Methods

Cognitive performance was evaluated by means of the Montreal Cognitive Assessment (MoCA) in COVID-19 survivors and noninfected individuals. All study participants had four cognitive evaluations, two of them before the pandemic and the other two at 6 and 18 months after the initial SARS-CoV-2 outbreak in the village. Linear mixed effects models for longitudinal data were fitted to assess differences in cognitive performance between COVID-19 survivors and noninfected individuals.

Results

The study included 78 participants, 50 with a history of mild COVID-19 and 28 without. There was a significant, likely age-related, decline in MoCA scores between the two prepandemic tests (β = −1.53, 95% confidence interval [CI] = −2.14 to −0.92, p < 0.001), which did not differ between individuals who later developed COVID-19 and noninfected individuals. Six months after infection, only COVID-19 survivors had a significant decline in MoCA scores (β = −1.37, 95% CI = −2.14 to −0.61, p < 0.001), which reversed after 1 additional year of follow-up (β = 0.66, 95% CI = −0.11 to 1.42, p = 0.092). No differences were noticed among noninfected individuals when both postpandemic MoCA scores were compared.

Conclusions

Study results suggest that long COVID‐related cognitive decline may spontaneously improve over time.

REVIEW ARTICLE

Background and purpose

The scientific literature on COVID-19 is growing rapidly.

Methods

In this paper, we review the literature on movement disorders in the context of the COVID‐19 pandemic.

Results

First, a variety of transient movement disorders may manifest in the acute phase of COVID-19, most often myoclonus, with more than 50 patients described in the literature. New-onset parkinsonism, chorea, and tic-like behaviours have also been reported. Movement disorders as a side effect after COVID-19 vaccination are rare, occurring with a frequency of 0.00002–0.0002 depending on the product used, and mostly manifesting with tremor. Current evidence for potential long-term manifestations, for example long COVID parkinsonism, is discussed separately. Second, the pandemic has also had an impact on patients with pre-existing movement disorder syndromes, with negative effects on clinical status and overall well-being, and reduced access to medication and health care. In many regions, the pandemic has led to a reorganization of the medical system, including the development of new digital solutions. The movement disorder-related evidence for this is reviewed and discussed.

Conclusions

The pandemic and the associated preventive measures have had a negative impact on the clinical status, access to health care, and overall well‐being of patients with pre‐existing movement disorders.

ORIGINAL ARTICLE

Background and purpose

Muscular A-type lamin-interacting protein (MLIP) is most abundantly expressed in cardiac and skeletal muscle. In vitro and animal studies have shown its regulatory role in myoblast differentiation and in the organization of myonuclear positioning in skeletal muscle, as well as in cardiomyocyte adaptation and cardiomyopathy. We report the association of biallelic truncating variation in the MLIP gene with human disease in five individuals from two unrelated pedigrees.

Methods

Clinical evaluation and exome sequencing were performed in two unrelated families with elevated creatine kinase level.

Results

In the first pedigree, a 6-year-old girl born to consanguineous parents of Arab-Muslim origin presented with myalgia, early fatigue after mild-to-moderate physical exertion, and elevated creatine kinase levels up to 16,000 U/L. Exome sequencing revealed a novel homozygous nonsense variant, c.2530C>T; p.Arg844Ter, in the MLIP gene. In the second pedigree, three individuals from two distantly related families of Old Order Amish ancestry presented with elevated creatine kinase levels, one of whom also presented with abnormal electrocardiography results. On exome sequencing, all showed homozygosity for a novel nonsense variant, c.1825A>T; p.Lys609Ter. Another individual from this pedigree, who had sinus arrhythmia and for whom the creatine kinase level was not available, was also homozygous for this variant.

Conclusions

Our findings suggest that biallelic truncating variants in MLIP result in a myopathy characterized by hyperCKemia. Moreover, these cases of MLIP-related disease may indicate that, at least in some instances, this condition is associated with muscle decompensation and fatigability during low-to-moderate intensity muscle exertion, as well as possible cardiac involvement.

ORIGINAL ARTICLE

Background and purpose

The roles of blood low-density lipoprotein cholesterol (LDL-C), high-density lipoprotein cholesterol (HDL-C) and triglycerides in the development of post-stroke dementia remain uncertain. This study aimed to investigate their potential associations.

Methods

A retrospective cohort study was conducted using the Clinical Practice Research Datalink. Patients with first‐ever stroke but no prior dementia were followed up for 10 years. Cox regression was used to examine the association of baseline LDL‐C, HDL‐C and triglycerides with post‐stroke dementia.

Results

Amongst 63,959 stroke patients, 15,879 had complete baseline data and were included in our main analysis. Of these, 10.8% developed dementia during a median of 4.6 years of follow-up. The adjusted hazard ratio of dementia for LDL-C (per log mmol/L increase) was 1.29 (95% confidence interval [CI] 1.14–1.47), with a linear increasing trend (p for trend < 0.001). The counterpart for triglycerides was 0.79 (95% CI 0.69–0.89), with a linear decreasing trend (p for trend < 0.001). For HDL-C, there was no association with dementia (adjusted hazard ratio 0.89, 95% CI 0.74–1.08) and no linear trend (p for trend = 0.22).

Conclusions

Blood lipids may affect the risk of post‐stroke dementia in different ways, with higher risk associated with LDL‐C, lower risk associated with triglycerides, and no association with HDL‐C.

ORIGINAL ARTICLE

Background and purpose

Treatment success in relapsing–remitting multiple sclerosis (RRMS) is generally determined using relapse frequency and magnetic resonance imaging (MRI) activity in the first 6 or 12 months on treatment. The association of these definitions of short‐term treatment success with disability worsening and disease activity in the longer term is unclear. In this study, we investigated risk factors associated with early first‐line treatment failure in RRMS, and the association of early treatment failure with subsequent disability worsening or "no evidence of disease activity" (NEDA‐3) status.

Methods

We used data from CombiRx (clinicaltrials.gov identifier NCT00211887) to investigate risk factors associated with early treatment failure, and the association of early treatment failure at 6 and 12 months with subsequent disability worsening or NEDA‐3 at 36 months.

Results

CombiRx included 1008 treatment‐naïve participants with RRMS, who were randomly assigned to treatment with glatiramer acetate, interferon beta, or the combination of both. Early treatment failure at 6 or 12 months by several definitions was associated with NEDA‐3 failure at 36 months, but not with subsequent disability worsening at 36 months. Expanded Disability Status Scale (EDSS) was the only baseline characteristic associated with the risk of disability worsening at 36 months. Approximately 70% of NEDA‐3 failures occurred due to MRI activity, and <10% occurred due to EDSS worsening.

Conclusions

Our investigation shows that current definitions of early treatment failure in RRMS are unrelated to patient‐relevant disability worsening at 36 months of follow‐up. Further research into useful definitions of treatment success and failure in RRMS is needed.

SHORT COMMUNICATION

Background and purpose

Augmentation is a paradoxical reaction mainly to dopaminergic medication in patients with restless legs syndrome (RLS), but the exact pathomechanism remains unclear. The aim of this study was to identify factors associated with augmentation in RLS patients.

Methods

RLS patients with and without current or previous augmentation were recruited. Demographic characteristics, history of smoking, questionnaires for depression, alexithymia, and impulsivity, and RLS severity were obtained.

Results

We included 122 patients, of whom half had a history of augmentation. Patients with augmentation had a longer disease duration (p = 0.001), higher RLS severity scores (p = 0.013), higher levodopa equivalent doses (p < 0.001), higher scores for alexithymia (p = 0.028), a higher prevalence of impulse control disorders (p < 0.001), more often had a history of smoking (p = 0.039), were more often currently smoking (p = 0.015), and had more pack-years on average (p = 0.016).

Conclusions

Here, we describe several factors commonly associated with augmentation in RLS. These may help clinicians to screen and treat patients carefully to avoid the challenging side effect of augmentation.

ORIGINAL ARTICLE

Background and purpose

Data on interruption of enzyme replacement therapy (ERT) are scarce in late onset Pompe disease. Due to the COVID‐19 crisis, eight neuromuscular reference centers in France were obligated to stop the treatment for 31 patients.

Methods

We collected the motor and respiratory data from our French registry, before COVID‐19 and at treatment restart.

Results

After a mean interruption of 2.2 months, patients showed a significant mean deterioration of 37 m in the 6-min walk test and a mean loss of 210 ml of forced vital capacity, without restoration ad integrum after 3 months of resumed ERT.

Conclusions

This national study based on data from the French Pompe Registry shows that the interruption of ERT, even as short as a few months, worsens Pompe patients' motor and respiratory function.

CASE REPORT

Background and purpose

With the advent of gene therapies for amyotrophic lateral sclerosis (ALS), the importance of gene testing in ALS is increasing. This will likely lead to the identification of new variants for which the pathogenicity is not established. We aimed to study the pathogenicity of a newly identified variant in superoxide dismutase 1 (SOD1).

Methods

Gene testing was performed using Sanger sequencing. SOD1 activity in erythrocytes was measured using spectrophotometry. Postmortem brain and spinal cord sections were stained with antibodies against phospho‐TDP‐43 and SOD1.

Results

We identified a novel c.416G>T (p.Gly139Val) mutation in SOD1, which caused a rapidly progressive, respiratory-onset form of ALS. The mutation resulted in a 50% drop in SOD1 activity. Postmortem examination confirmed the absence of TDP-43 pathology and showed typical SOD1 inclusions in the remaining motor neurons, confirming the pathogenic nature of the mutation.

Conclusions

Novel variants of unknown pathogenicity will be identified as a result of the surge in gene testing in people with ALS. An in-depth study of the newly identified p.Gly139Val mutation in SOD1 confirmed the pathogenicity of this mutation. Future patients with this particular mutation should qualify for gene silencing or editing therapies.

ERRATUM

 

ORIGINAL ARTICLE

Background and purpose

Studies have not yet found conclusive results on the risk of cancer in patients with multiple sclerosis (MS). This study aimed to compare the incidence of all cancers and of specific types of cancer between MS patients and the general population by age and by sex.

Methods

All prevalent MS patients identified between 2008 and 2014 in the nationwide French health care database (Système National des Données de Santé) and without history of malignancy were included in a cohort study and followed up until cancer occurrence, date of death, or 31 December 2015, whichever came first. MS patients were matched based on sex and year of birth to non‐MS controls from the general population without cancer before index date. Incidence rate was reported per 100,000 person‐years (PY), and risk of cancer was estimated by type of cancer, age, and sex using a Cox model (hazard ratio [HR] and its 95% confidence interval [CI]).
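As a brief reminder of how the person-year rates reported below are calculated, here is a minimal worked sketch; the event and follow-up counts are placeholders chosen only to reproduce the order of magnitude quoted in the results, not the study's actual data.

```python
# Incidence rate per 100,000 person-years (PY): number of incident events
# divided by the total follow-up time accumulated, then rescaled.
events = 1150            # hypothetical incident cancers in the cohort
person_years = 199_650   # hypothetical total follow-up (PY)

rate_per_100k_py = events / person_years * 100_000
print(round(rate_per_100k_py, 1))  # -> 576.0 per 100,000 PY in this toy example
```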

Results

Overall, 576 cancers per 100,000 PY were observed in MS patients versus 424 per 100,000 PY in the control population. The risk of cancer was higher among MS patients than among population controls whether considered overall (HR = 1.36, 95% CI = 1.29–1.43) or for prostate (HR = 2.08, 95% CI = 1.68–2.58), colorectal and anal (HR = 1.35, 95% CI = 1.16–1.58), trachea, bronchus, and lung (HR = 2.36, 95% CI = 1.96–2.84), and to a lesser extent, breast cancer (HR = 1.12, 95% CI = 1.03–1.23).

Conclusions

MS patients had an increased risk of cancer compared to population controls.

ORIGINAL ARTICLE

Background and purpose

The transition from relapsing–remitting to secondary progressive multiple sclerosis (SPMS) is not well defined. Different definitions and tools to identify SPMS have been proposed. Meanwhile, early diagnosis of “active” SPMS is getting progressively more important as pharmaceutical treatment options are developed. In this study, we compared different classification methods regarding their accuracy to reliably identify “active SPMS.”

Methods

Independent from previous diagnostic classification, we descriptively analyzed the disease course (regarding relapses, progression, and magnetic resonance imaging activity) in 208 consecutive multiple sclerosis (MS) patients treated in our MS outpatient clinic in 2018. Patients were reclassified according to different SPMS criteria and tools. Diagnostic accuracy in identifying patients with “active SPMS” was determined.

Results

Comparing the tools to each other, significant variability in the number of patients identified as having SPMS as well as in the proportion of these patients having “active SPMS” was noted. Applying both diagnostic criteria “SPMS” and “active disease” reduced the sensitivity in identifying patients with active progressive disease in all approaches.

Conclusions

We propose lessening the emphasis on the label “SPMS” in favor of the more open term “active progressive disease” to simplify the process of identifying patients who may benefit from immune therapy.

SHORT COMMUNICATION

Background and purpose

Creutzfeldt–Jakob disease (CJD) is lethal and transmissible. We assessed the impact of the COVID‐19 pandemic on UK CJD surveillance. We hypothesized that (i) disruptions prolonged diagnostic latency; (ii) autopsy rates declined; and (iii) COVID‐19 infection negatively affected diagnosis, care, and survival.

Methods

We retrospectively investigated the first year of the pandemic, using the preceding year as a comparator, quantifying numbers of individuals assessed by the UK National CJD Research & Surveillance Unit for suspected CJD, time to diagnosis, disease duration, and autopsy rates. We evaluated the impact of COVID‐19 status on diagnosis, care, and survival in CJD.

Results

A total of 148 individuals were diagnosed with CJD in the pandemic (from a total of 166 individuals assessed) compared to 141 in the comparator (from 145 assessed). No differences were identified in disease duration or time to diagnosis. Autopsy rates were unchanged. Twenty individuals had COVID‐19; 60% were symptomatic, and 10% had severe disease. Disruptions in diagnosis and care were frequently identified. Forty percent of COVID‐19‐positive individuals died; however, COVID‐19 status did not significantly alter survival duration in CJD.

Conclusions

The COVID‐19 pandemic has not impacted UK CJD case ascertainment or survival, but diagnostic evaluation and clinical care of individuals have been affected.

ORIGINAL ARTICLE

Background and purpose

Intraplaque hemorrhage is a key feature of vulnerable carotid atherosclerotic plaque (CAP), associated with low densities (<25 Hounsfield units [HU]) on computed tomographic angiography (CTA). This study aimed to analyze CAP on routine CTA performed in patients with symptomatic and asymptomatic carotid stenosis undergoing carotid endarterectomy (CEA) by assessing HU of the CAP area showing the lowest density (CAPALD) using radiological tools available in daily clinical practice, and to compare CAPALD values between symptomatic and asymptomatic carotids.

Methods

We retrospectively screened preoperative CTA scans of 206 consecutive adult patients undergoing CEA for symptomatic or asymptomatic stenosis. CAPALD values were compared between symptomatic and asymptomatic carotids. Asymptomatic carotids included arteries contralateral to the symptomatic CEA artery, and asymptomatic stenotic arteries undergoing CEA and their contralateral arteries. Carotids were excluded when there was <30% stenosis, or when CAP could not be identified or CAPALD could not be measured.

Results

In total, 95 symptomatic and 112 asymptomatic carotids (derived from 174 patients) were analyzed. In multivariate analysis, symptomatic arteries showed more severe stenosis (median 70% vs. 67%, p = 0.0228) and lower CAPALD values (median 17 vs. 25 HU, p = 0.049), whereas degree of stenosis and CAPALD values were not correlated (rho = −0.02, p = 0.77). Values of <25 HU were more frequent in symptomatic than asymptomatic carotids (68% vs. 47%, p = 0.0022).

Conclusions

On CTA, symptomatic carotids are associated with CAP areas of low density. CTA analysis of CAP may help identify vulnerable plaques at risk of causing future stroke, especially in patients lacking strict indications for CEA based on current guidelines.

ORIGINAL ARTICLE

Background and purpose

Variants in the glucocerebrosidase (GBA) gene are recognized as a common and important genetic risk factor for Parkinson disease (PD). However, the impact of GBA variant severity on the clinical phenotype of PD in the Chinese population remains unclear. Thus, the present study aimed to determine the frequency of GBA-related PD (GBA-PD) and the relationship of GBA variant severity with clinical characteristics in a large Chinese cohort.

Methods

Long-range polymerase chain reaction and next generation sequencing were performed for the entire GBA gene. GBA variant severity was classified into five classes: mild, severe, risk, complex, and unknown.

Results

Among the total of 737 PD patients, 47 GBA variants were detected in 79 (10.72%) patients, and the most common variants were R163Q, L444P, and R120W. Complete demographic and clinical data were obtained for 673 patients, which revealed that 18.50% of early onset PD patients carried GBA variants. Compared with patients without GBA variants, GBA-PD patients experienced PD onset an average of 4 years earlier and had more severe motor and nonmotor symptoms. Patients carrying severe and complex GBA variants had a higher burden of nonmotor symptoms, especially depression, and more mood/cognitive and gastrointestinal symptoms than patients carrying mild variants.

Conclusions

GBA-PD is highly prevalent in the Chinese population. The severity of GBA variants underlies distinct phenotypic spectrums, with PD patients carrying severe and complex variants appearing to have similar phenotypes. PD patient stratification by GBA variant severity should become a prerequisite for selecting specific treatments.

CASE REPORT

Background and purpose

The aim was to provide further evidence for sirolimus, a mammalian target of rapamycin inhibitor, as a treatment strategy for patients with inclusion body myositis (IBM).

Methods

We acquired longitudinal clinical data and immunological assessments of CD8 T‐cell subsets in peripheral blood for evaluation of potential anti‐inflammatory treatment effects of sirolimus.

Results

Therapy with sirolimus 2 mg/day by mouth led to rapid and sustained clinical improvement of motor symptoms for an observation period of more than 1 year. Treatment was well tolerated, with no occurrence of adverse effects. We did not observe a meaningful alteration of CD8 T‐cell subsets in our patient after 9 and 12 months compared to baseline.

Conclusions

The significant and persistent clinical improvement highlights the use of sirolimus as a potential treatment option in patients with IBM. In light of the lack of immunological treatment effects observed for cytotoxic CD8 T cells, further studies should investigate the potential myoprotective effects of sirolimus.

ORIGINAL ARTICLE

Objectives

Multiple system atrophy (MSA) is a rare fatal neurodegenerative disease characterized by parkinsonism, cerebellar ataxia and autonomic failure. This study was aimed at investigating possible associations between mortality, 24‐h blood pressure (BP) level and variability, and drug treatments for orthostatic hypotension (OH) in MSA patients.

Methods

A total of 129 patients followed at the French Reference Center for MSA who underwent routine 24‐h ambulatory BP monitoring were included. Unified MSA Rating Scale (UMSARS) scores, drug treatments and the occurrence and cause of death were recorded.

Results

Seventy patients died during follow-up (2.9 ± 1.8 years), mainly from terminal illness, pulmonary causes or sudden death. Multivariate Cox regression analysis, after adjustment for gender, disease duration and severity (UMSARS I+II score), showed that increased daytime systolic BP variability, OH severity and OH drug treatment were independently correlated with mortality. OH treatment was associated with the risk of cardiac causes and/or sudden death (p = 0.01). In a fully adjusted model, male gender ([female vs. male] hazard ratio [HR] 0.56, 95% CI 0.34–0.94, p = 0.03), UMSARS I+II score (HR 1.04, 95% CI 1.02–1.06, p < 0.01), daytime systolic BP variability (HR 3.66, 95% CI 1.46–9.17, p < 0.01) and OH treatment (HR 2.13, 95% CI 1.15–3.94, p = 0.02) predicted mortality.

Conclusions

Increased daytime BP variability and OH treatment were predictive of mortality in patients with MSA, independently of disease severity. Further studies are required to assess whether these associations are explained by more severe autonomic dysfunction or whether OH treatment per se exposes patients in this population to a specific risk.

ORIGINAL ARTICLE

Background and purpose

Accumulating evidence indicates that dynamic amplitude of low‐frequency fluctuations (dALFF) or dynamic functional connectivity (dFC) can provide complementary information, distinct from static amplitude of low‐frequency fluctuations (sALFF) or static functional connectivity (sFC), in detecting brain functional abnormalities in brain diseases. We aimed to examine whether dALFF and dFC can offer valuable information for the detection of functional brain abnormalities in patients with blepharospasm.

Methods

We collected resting‐state functional magnetic resonance imaging data from 46 patients each of blepharospasm, hemifacial spasm (HFS), and healthy controls (HCs). We examined intergroup differences in sALFF and dALFF to investigate abnormal regional brain activity in patients with blepharospasm. Based on the dALFF results, we conducted seed‐based sFC and dFC analyses to identify static and dynamic connectivity changes in brain networks centered on areas showing abnormal temporal variability of local brain activity in patients with blepharospasm.

Results

Compared with HCs, patients with blepharospasm displayed different brain functional change patterns characterized by increased sALFF in the left primary motor cortex (PMC) but increased dALFF variance in the right PMC. However, differences were not found between patients with HFS and HCs. Additionally, patients with blepharospasm exhibited decreased dFC strength, but no change in sFC, between right PMC and ipsilateral cerebellum compared with HCs; these findings were replicated when patients with blepharospasm were compared to those with HFS.

Conclusions

Our findings highlight that dALFF and dFC are complementary to sALFF and sFC and can provide valuable information for detecting brain functional abnormalities in blepharospasm. Blepharospasm may be a network disorder involving the cortico‐ponto‐cerebello‐thalamo‐cortical circuit.

ORIGINAL ARTICLE

Background and purpose

Levodopa‐induced dyskinesia (LID) is a common motor complication in patients with Parkinson's disease (PD). Although amantadine is indicated for LID treatment, it is uncertain whether early treatment with amantadine reduces the risk of LID in patients with PD. We aimed to evaluate the association between amantadine treatment and LID onset in patients with early‐stage PD.

Methods

This was a hospital‐based retrospective cohort study that used electronic medical records from January 1, 2009 to October 31, 2016. The effect of amantadine on LID onset was compared with those of anticholinergics and monoamine oxidase type B inhibitors in patients with PD. Propensity‐score weighting and landmark analysis were used to reduce potential confounding. The time to LID onset was analyzed using Cox models. Sensitivity analyses were performed to determine the robustness of the results.

Results

The analyses included 807, 661, and 518 patients at 6‐, 12‐, and 18‐month landmark points, respectively. Amantadine use was associated with delayed LID onset in the 6‐ and 12‐month landmark analyses, with adjusted hazard ratios of 0.65 (95% confidence interval [CI] = 0.49–0.86) and 0.64 (95% CI = 0.47–0.88), respectively. Sensitivity analysis findings were comparable to those of the main analysis.

Conclusions

Early treatment with amantadine may delay LID onset more than treatment with other symptomatic agents. Further studies are needed to elucidate the mechanism of amantadine in LID onset delay and to validate our findings.

REVIEW ARTICLE

Background and purpose

Portable and wearable devices can monitor a number of physical performances and lately have been applied to patients with neuromuscular disorders (NMDs).

Methods

We performed a systematic search of literature databases following PRISMA (Preferred Reporting Items for Systematic Reviews and Meta‐Analyses) principles, including all studies reporting the use of technological devices for motor function assessment in NMDs from 2000 to 2021. We also summarized the evidence on measurement properties (validity, reliability, responsiveness) of the analyzed technological outcome measures.

Results

One hundred studies fulfilled the selection criteria, most of them published in the past 10 years. We defined four categories that gathered similar technologies: gait analysis tools, for clinical assessment of pace and posture; continuous monitoring of physical activity with inertial sensors, which allow “unsupervised” activity assessment; upper limb evaluation tools, including Kinect‐based outcome measures to assess the reachable workspace; and new muscle strength assessment tools, such as Myotools. Inertial sensors have the evident advantage of being applied in the “in‐home” setting, which has become especially appealing during the COVID‐19 pandemic, although poor evidence from psychometric property assessment and results of the analyzed studies may limit their research application. Both Kinect‐based outcome measures and Myotools have already been validated in multicenter studies and different NMDs, showing excellent characteristics for application in clinical trials.

Conclusions

This overview is intended to raise awareness on the potential of the different technology outcome measures in the neuromuscular field and to be an informative source for the design of future clinical trials, particularly in the era of telemedicine.

ORIGINAL ARTICLE

Background and purpose

Erenumab (ERE) is the first anticalcitonin gene‐related peptide receptor monoclonal antibody approved for migraine prevention. A proportion of patients do not adequately respond to ERE.

Methods

This was a prospective multicenter study involving 110 migraine patients starting ERE 70 mg monthly. Baseline sociodemographic and migraine characteristics, including mean monthly migraine days (MMDs), migraine-related burden (MIDAS [Migraine Disability Assessment scale] and Headache Impact Test-6), and use of abortive medications during the 3 months before and after ERE start were collected. Real-time polymerase chain reaction was used to determine polymorphic variants of the calcitonin receptor-like receptor and receptor activity-modifying protein-1 genes. Logistic regression models were used to identify independent predictors for 50% responder patients (50-RESP) and 75% responder patients (75-RESP).

Results

At month 3, MMDs decreased from 17.2 to 9.2 (p < 0.0001); 59/110 (53.6%) patients were 50-RESP and 30/110 (27.3%) were 75-RESP. Age at migraine onset (odds ratio [OR] [95% confidence interval (CI)] 1.062 [1.008–1.120], p = 0.024), number of failed preventive medications (0.753 [0.600–0.946], p = 0.015), and MIDAS score (1.011 [1.002–1.020], p = 0.017) were associated with 75-RESP. Among the genetic variants investigated, one was associated with a lower probability of being 75-RESP (per G allele, OR [95% CI] 0.53 [0.29–0.99], p = 0.048), but this association did not survive adjustment for confounding clinical variables (per G allele, 0.55 [0.28–1.10], p = 0.09).

Conclusions

In this real-world study, treatment with ERE significantly reduced MMDs. The number of failed preventive medications, migraine burden, and age at migraine onset predicted response to ERE. Larger studies are required to confirm a possible role of this variant as a genetic predictor of ERE efficacy.

ORIGINAL ARTICLE

Background and purpose

The aim was to evaluate urinary neopterin, a marker of pro‐inflammatory state, as a potential biomarker of disease prognosis and progression in amyotrophic lateral sclerosis (ALS); and to compare its utility to urinary neurotrophin receptor p75 extracellular domain (p75).

Methods

This was an observational study including 21 healthy controls and 46 people with ALS, 29 of whom were sampled longitudinally. Neopterin and p75 were measured using enzyme‐linked immunoassays. Baseline and longitudinal changes in clinical measures, neopterin and urinary p75 were examined, and prognostic utility was explored by survival analysis.

Results

At baseline, urinary neopterin was higher in ALS compared to controls (181.7 ± 78.9 vs. 120.4 ± 60.8 μmol/mol creatinine, p = 0.002, Welch's test) and correlated with the Revised ALS Functional Rating Scale (r = −0.36, p = 0.01). Combining previously published urinary p75 results from 22 ALS patients with a further 24 ALS patients, baseline urinary p75 was also higher compared to healthy controls (6.0 ± 2.7 vs. 3.2 ± 1.0 ng/mg creatinine, p < 0.0001) and correlated with the Revised ALS Functional Rating Scale (r = −0.36, p = 0.01). Urinary neopterin and p75 correlated with each other at baseline (r = 0.38, p = 0.009). In longitudinal analysis of 29 ALS patients, urinary neopterin increased from diagnosis on average (±SE) by 6.8 ± 1.1 µmol/mol creatinine per month (p < 0.0001) and p75 by 0.19 ± 0.02 ng/mg creatinine per month (p < 0.0001).

Conclusion

Urinary neopterin holds promise as a marker of disease progression in ALS and warrants future evaluation of its potential to predict response to anti-inflammatory therapies.

ORIGINAL ARTICLE

Background and purpose

Anti‐acetylcholine receptor (AChR) antibodies (ab) in the serum are detected in most patients with generalized myasthenia gravis (MG) and used as a diagnostic tool. The aim of this study was to analyse a possible association between anti‐AChR‐ab serum levels and clinical improvement of MG.

Methods

The Maastricht University Medical Center is a centre of expertise for the treatment of MG. Between 1997 and 2020, more than 4000 anti‐AChR‐ab blood samples were measured for clinical care using a quantitative radioimmunoassay technique. These results, in combination with clinical status obtained from the patients’ electronic patient files, were retrospectively analysed by a single blinded clinician. Symptoms of MG were classified using the Myasthenia Gravis Foundation of America (MGFA) scale.

Results

In total, 90 anti-AChR-ab-positive MG patients with 837 blood samples were included. The median follow-up time was 72 months. The majority of the included patients were women (61.1%), were on immunosuppressive drug therapy (88.9%), and had undergone a thymectomy (54.4%). Multilevel logistic regression analysis showed a significant inverse association between change in anti-AChR-ab level and the odds of MGFA improvement (per 10% decrease in anti-AChR-ab level: odds ratio 1.21, 95% confidence interval 1.12–1.31; p < 0.001).

Conclusions

A change in anti-AChR-ab serum level is associated with clinical status in patients with MG. Analyses of anti-AChR-ab are useful not only for diagnosis but also in the follow-up of adult symptomatic patients with MG. Repeated anti-AChR-ab serum measurements might be valuable for long-term monitoring of clinical improvement in patients with MG; however, further research is required before specific recommendations can be made.

ORIGINAL ARTICLE

Background and purpose

This study aimed to evaluate the clinical characteristics and prognosis of late onset (≥50 years) neuromyelitis optica spectrum disorder (LO‐NMOSD), and compare them with those of early onset (<50 years) NMOSD (EO‐NMOSD) and NMOSD with various antibody serostatuses.

Methods

From January 2015 to December 2020, 360 anti‐aquaporin 4 antibody (AQP4‐ab)‐positive and 130 anti‐myelin oligodendrocyte glycoprotein antibody (MOG‐ab)‐positive patients presented to the Huashan Hospital, China. We retrospectively reviewed their medical records, including the Expanded Disability Status Scale (EDSS) score at each visit and the annualized relapse rate (ARR). Prognostic outcomes included the time to first relapse, blindness, motor dysfunction, severe motor dysfunction, and death. Correlations between the age at onset, lesion location, and clinical parameters were analyzed.

Results

This study included 122 (24.9%) patients with LO‐NMOSD, 101 with AQP4‐ab and 21 with MOG‐ab. Compared with EO‐NMOSD patients, those with LO‐NMOSD had higher EDSS scores and more frequent disease onset with transverse myelitis, blindness, motor dysfunction, and severe motor dysfunction. Compared with LO‐NMOSD patients with MOG‐ab, those with AQP4‐ab had a worse prognosis. Age at disease onset had a significantly positive correlation with EDSS score at the last follow‐up of all NMOSD patients, but a negative correlation with ARR‐1 (ARR excluding the first attack, calculated from disease onset to final follow‐up) in NMOSD patients with AQP4‐ab.

Conclusions

Patients with LO‐NMOSD, especially those with AQP4‐ab, had a worse prognosis compared with patients with EO‐NMOSD. Age at disease onset and antibody serostatus predicted blindness and motor dysfunction.

SHORT COMMUNICATION

Background and purpose

Middle-aged persons living with HIV (PLHIV) are at heightened risk of developing, at younger-than-expected ages, concomitant age-related comorbidities that are acknowledged markers of poorer prognosis after deep-brain stimulation of the subthalamic nucleus (STN-DBS). We aimed to assess the beneficial and adverse effects of STN-DBS in PLHIV with Parkinson's disease (PD).

Methods

We retrospectively included nine PLHIV with PD who had sustained virological control. Patients were followed up for 7 ± 4 years.

Results

Patients’ mean ages at PD onset and STN‐DBS were 45 ± 15 and 53 ± 16 years, respectively. At STN‐DBS, mean HIV infection and PD durations were 15 ± 12 and 8 ± 4 years, respectively. STN‐DBS significantly improved 1‐year Unified Parkinson's Disease Rating Scale (UPDRS)‐III scores (71%), daily off‐time (63%), motor fluctuations (75%) and daily levodopa‐equivalent dose (68%); mean 5‐year UPDRS‐III score and motor fluctuation improvements remained ~45%. Impulse control disorders (affecting 6/9 patients) fully resolved after STN‐DBS. Postoperative course was uneventful. No serious adverse events occurred during follow‐up.

Conclusion

Our findings indicate that STN‐DBS is a safe and effective treatment for PLHIV with PD.

ORIGINAL ARTICLE

Background and purpose

Previous studies have developed several cognitive composites in preclinical Alzheimer disease (AD). However, more sensitive measures to track cognitive changes and therapeutic efficacy in preclinical AD are needed considering the diverse sociocultural and linguistic backgrounds. This study developed a composite score that can sensitively detect the amyloid‐β (Aβ)‐related cognitive trajectory of preclinical AD using Korean data.

Methods

A total of 196 cognitively normal participants who underwent amyloid positron emission tomography were followed up with neuropsychological assessments. We developed the Longitudinal Amyloid Cognitive Composite in Preclinical AD (LACPA) using linear mixed-effects models (LMMs) and z scores. The LMM was also used to investigate the longitudinal sensitivity of the LACPA and the association between time-varying brain atrophy and the LACPA.
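To make the composite construction concrete, the sketch below turns raw neuropsychological scores into z scores and averages them into one composite value per visit; the test names, normative means/SDs, and equal weighting are illustrative assumptions, not the actual LACPA specification.

```python
import pandas as pd

# Hypothetical raw scores for one participant at two visits (values invented).
raw = pd.DataFrame(
    {"svlt_delayed_recall": [7, 5], "tmt_b_time_sec": [95, 120], "mmse": [28, 26]},
    index=["baseline", "follow_up"],
)

# Illustrative normative means and SDs (placeholders, not published Korean norms).
norms = {
    "svlt_delayed_recall": (8.0, 2.5),
    "tmt_b_time_sec": (100.0, 30.0),
    "mmse": (27.0, 2.0),
}

z = raw.astype(float).copy()
for col, (mean, sd) in norms.items():
    z[col] = (raw[col] - mean) / sd
z["tmt_b_time_sec"] *= -1          # longer TMT-B times are worse, so flip the sign

composite = z.mean(axis=1)          # unweighted mean of z scores per visit
print(composite)
```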

Results

Considering the group-time interaction effects of each subtest, the Seoul Verbal Learning Test-Elderly version immediate recall/delayed recall/recognition, the Korean Trail Making Test B time, and the Korean Mini-Mental State Examination were selected as components of the LACPA. The LACPA exhibited a significant group-time interaction effect between the Aβ+ and Aβ− groups (t = −3.288, p = 0.001). Associations between time-varying LACPA and brain atrophy were found in the bilateral medial temporal, right lateral parietal, and right lateral frontal regions, as well as hippocampal volume.

Conclusions

The LACPA may help reduce the time and financial burden of monitoring Aβ-related cognitive decline and the therapeutic efficacy of disease-modifying agents specifically targeting Aβ in secondary prevention trials.