A NeuralSet Implementation: Training a Deep Learning Model to Decode MEG Signals and Predict Linguistic Features

Tech | By Gavin Wallace | 02/05/2026 | 2 Mins Read
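This section picks up the tutorial at the training stage, so it relies on objects built in the earlier steps: model, train_loader, val_loader, test_loader, study_name, and a prep() helper. As a minimal sketch of that assumed context (the imports match what the code below uses; the device logic and the prep() body are illustrative guesses, not the tutorial's actual definitions):

import numpy as np
import torch
import torch.nn as nn
import matplotlib.pyplot as plt

device = "cuda" if torch.cuda.is_available() else "cpu"

def prep(batch):
    # Hypothetical helper: unpack an (x, y) pair and move both tensors to the device.
    x, y = batch
    return x.to(device), y.to(device)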
EPOCHS  = 15
opt     = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-4)
sched   = torch.optim.lr_scheduler.CosineAnnealingLR(opt, T_max=EPOCHS)
loss_fn = nn.MSELoss()
hist    = {"tr": [], "va": [], "r": []}


def pearson(a, b):
    # Pearson correlation of two 1-D tensors: mean-center, then normalized dot product.
    a, b = a - a.mean(), b - b.mean()
   return (a*b).sum() / (a.norm()*b.norm() + 1e-8)
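As a quick sanity check (a hypothetical snippet, not part of the original tutorial), pearson() should agree with NumPy's corrcoef on 1-D inputs, since both compute the same mean-centered normalized dot product:

a = torch.randn(1000)
b = 0.6 * a + 0.8 * torch.randn(1000)   # correlated with a by construction
print(f"ours={pearson(a, b).item():+.4f}  numpy={np.corrcoef(a.numpy(), b.numpy())[0, 1]:+.4f}")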


print("n" + "="*64)
print(f"{'Epoch':>5} | {'train':>9} | {'val':>9} | {'val_r':>7}")
print("="*64)
for ep in range(EPOCHS):
    model.train(); tr = []
    for batch in train_loader:
       x, y = prep(batch)
       loss = loss_fn(model(x), y)
       opt.zero_grad(); loss.backward()
       torch.nn.utils.clip_grad_norm_(model.parameters(), 1.0)
       opt.step(); tr.append(loss.item())
   sched.step()


    model.eval(); va, P, T = [], [], []
    with torch.no_grad():
        for batch in val_loader:
           x, y = prep(batch); p = model(x)
           va.append(loss_fn(p, y).item()); P.append(p.cpu()); T.append(y.cpu())
   P, T = torch.cat(P), torch.cat(T)
    r = pearson(P, T).item()
    hist["tr"].append(np.mean(tr)); hist["va"].append(np.mean(va)); hist["r"].append(r)
   print(f"{ep+1:>5d} | {np.mean(tr):>9.4f} | {np.mean(va):>9.4f} | {r:>+7.3f}")


model.eval(); P, T = [], []
with torch.no_grad():
    for batch in test_loader:
       x, y = prep(batch)
       P.append(model(x).cpu()); T.append(y.cpu())
P, T = torch.cat(P), torch.cat(T)
test_r   = pearson(P, T).item()
test_mse = ((P - T) ** 2).mean().item()
print(f"nTEST  |  Pearson r = {test_r:+.3f}   MSE = {test_mse:.3f}")
print(f"(Synthetic-MEG signals are random by design — small/zero r is expected.)")


fig, ax = plt.subplots(1, 3, figsize=(15, 4))
ax[0].plot(hist["tr"], label="train"); ax[0].plot(hist["va"], label="val")
ax[0].set(xlabel="Epoch", ylabel="MSE", title="Loss curves"); ax[0].legend(); ax[0].grid(alpha=.3)
ax[1].plot(hist["r"], color="C2"); ax[1].axhline(0, color="k", ls="--", alpha=.4)
ax[1].set(xlabel="Epoch", ylabel="Pearson r", title="Validation correlation"); ax[1].grid(alpha=.3)
m = float(max(T.abs().max(), P.abs().max()))
ax[2].scatter(T.numpy(), P.numpy(), s=10, alpha=.35)
ax[2].plot([-m, m], [-m, m], "k--", alpha=.4)
ax[2].set(xlabel="True (z-scored char count)", ylabel="Predicted",
         title=f"Test predictions (r = {test_r:+.3f})"( );[2].grid(alpha=.3)
plt.tight_layout(); plt.show()


print("n✅ Tutorial complete!")
print(f"  • Study used        : {study_name}")
print(f"  • Pipeline          : Chain → Segmenter → SegmentDataset → DataLoader")
print(f"  • Custom extractor  : CharCount (subclass of BaseStatic)")
print(f"  • Built-in extractor: MegExtractor @ 100 Hz")
print(f"  • Model             : 1×1 spatial conv + 2 temporal convs + linear head")