
🌐 [translation-sync] Revise JAX intro lecture and add autodiff lecture#87

Open
mmcky wants to merge 7 commits into main from translation-sync-2026-04-08T00-29-23-pr-513

Conversation

@mmcky (Contributor) commented Apr 8, 2026

Automated Translation Sync

This PR contains automated translations from QuantEcon/lecture-python-programming.

Source PR

#513 - Revise JAX intro lecture and add autodiff lecture

Files Added

  • lectures/autodiff.md
  • .translate/state/autodiff.md.yml

Files Updated

  • ✏️ lectures/jax_intro.md
  • ✏️ .translate/state/jax_intro.md.yml
  • ✏️ lectures/numpy_vs_numba_vs_jax.md
  • ✏️ .translate/state/numpy_vs_numba_vs_jax.md.yml
  • ✏️ lectures/_toc.yml

Details

  • Source Language: en
  • Target Language: fa
  • Model: claude-sonnet-4-6

This PR was created automatically by the translation action.


netlify bot commented Apr 8, 2026

Deploy Preview for majestic-griffin-10b166 ready!

| Name | Link |
|------|------|
| 🔨 Latest commit | bce7709 |
| 🔍 Latest deploy log | https://app.netlify.com/projects/majestic-griffin-10b166/deploys/69d5a16b24da670008e6b277 |
| 😎 Deploy Preview | https://deploy-preview-87--majestic-griffin-10b166.netlify.app |


github-actions bot commented Apr 8, 2026

✅ Translation Quality Review

Verdict: PASS | Model: claude-sonnet-4-6 | Date: 2026-04-08


📝 Translation Quality

| Criterion | Score |
|-----------|-------|
| Accuracy | 9/10 |
| Fluency | 9/10 |
| Terminology | 9/10 |
| Formatting | 9/10 |
| Overall | 9/10 |

Summary: The translation is of high quality across all modified and added sections. It accurately conveys the technical content of the autodiff, gradient descent, JIT compilation, vmap, and comparison sections. Terminology is largely consistent with the reference glossary. A few minor issues exist: 'گرادیان کاهشی' is a non-standard rendering of 'Gradient Descent', 'مجموع مربعات کمترین' is an awkward translation of 'least squares' where 'حداقل مربعات' is preferred, and 'عملیات برداری‌سازی‌شده' in the overall recommendations is slightly redundant. These are minor and do not significantly impede comprehension. Formatting and syntax are clean with no errors detected.

  • Technical terms are consistently translated throughout all added sections, with good adherence to the reference glossary (e.g., 'حداقل مربعات معمولی', 'مشتق‌گیری خودکار', 'برنامه‌نویسی تابعی').
  • Mathematical LaTeX notation is perfectly preserved across all equations in the new sections, including inline math and display math blocks.
  • The translation of the autodiff conceptual explanations (chain rule, primitive functions, finite differences vs symbolic calculus) is accurate and reads naturally in Persian.
  • The new sections on JIT compilation workflow and vmap are accurately translated with clear, fluent Persian that preserves the pedagogical tone of the original.
  • Code comments within code cells remain in English appropriately, while surrounding explanatory text is fully translated.
  • The 'Overall recommendations' section captures the nuanced trade-off discussion between JAX, NumPy, and Numba accurately and fluently.

Suggestions:

  • Gradient Descent section title: 'گرادیان کاهشی' → Consider 'نزول گرادیان' or 'گرادیان نزولی' as more standard Persian translations for 'Gradient Descent'; 'گرادیان کاهشی' is understandable but less conventional in Persian ML/optimization literature

  • Simulated data section: 'مجموع مربعات کمترین' → Should be 'حداقل مربعات' (Ordinary Least Squares); 'مجموع مربعات کمترین' is an awkward literal rendering, while 'حداقل مربعات' is the established Persian term per the reference glossary

  • Overall recommendations section: 'عملیات برداری‌سازی‌شده' → 'عملیات برداری‌شده' is more fluent; the double suffix '-سازی‌شده' on 'برداری' is redundant and the reference glossary uses 'برداری‌سازی' for 'Vectorization' as a process, not 'برداری‌سازی‌شده' as an adjective

  • Autodiff is not symbolic calculus: 'عبارت بسته واحد' → 'عبارت صوری بسته' or simply 'عبارت بسته' would be more precise for 'closed-form expression'; the word 'واحد' (single/one) could be dropped or rephrased as 'یک عبارت بسته' to avoid ambiguity with 'unit'


🔍 Diff Quality

| Check | Status |
|-------|--------|
| Scope | Correct |
| Position | Correct |
| Structure | Preserved |
| Heading-map | Correct |
| Overall | 10/10 |

Summary: All three files are correctly modified with proper translations, accurate heading maps, and preserved document structure.


This review was generated automatically by action-translation review mode.

Copilot AI left a comment

Pull request overview

This automated translation-sync PR updates the Persian (fa) JAX-related lecture content and expands the curriculum by adding a new autodiff lecture, aligning this repo with the upstream QuantEcon source.

Changes:

  • Added a new autodiff lecture and inserted it into the lecture table of contents.
  • Revised jax_intro (notably: expanded JIT explanation, added vmap section, and reframed gradients as an autodiff preview).
  • Updated numpy_vs_numba_vs_jax with small code/wording improvements and added an “Overall recommendations” section.
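As a quick illustration of the kind of `vmap` usage the new jax_intro section covers (a sketch with a made-up function, not the lecture's actual code):

```python
import jax
import jax.numpy as jnp

def f(x):
    # scalar function we want to apply elementwise
    return jnp.cos(x ** 2)

# vmap produces a vectorized version of f over the leading axis,
# without rewriting f itself
f_vec = jax.vmap(f)

xs = jnp.linspace(0.0, 1.0, 5)
ys = f_vec(xs)          # same values as jnp.cos(xs ** 2)
```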

Reviewed changes

Copilot reviewed 7 out of 7 changed files in this pull request and generated 3 comments.

Show a summary per file

| File | Description |
|------|-------------|
| lectures/autodiff.md | New autodiff lecture content (examples + exercises) added. |
| lectures/jax_intro.md | Updated JAX intro lecture with new sections (JIT internals, vmap, autodiff preview) and refreshed RNG API usage. |
| lectures/numpy_vs_numba_vs_jax.md | Adjusted imports / plotting colormap, fixed a Numba result variable name, and added overall recommendations section. |
| lectures/_toc.yml | Added autodiff to the TOC so it appears in the built book. |
| .translate/state/autodiff.md.yml | New translation state for the added lecture. |
| .translate/state/jax_intro.md.yml | Updated translation sync metadata for the revised lecture. |
| .translate/state/numpy_vs_numba_vs_jax.md.yml | Updated translation sync metadata for the revised lecture. |


Comment on lines 72 to 79
```python
import random
from functools import partial

import numpy as np
import numba
import quantecon as qe
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d.axes3d import Axes3D
```

Copilot AI Apr 8, 2026


The imports in this code cell include modules that appear unused in the notebook (random is never called, and Axes3D is imported but never referenced). If they're not required for side effects (e.g., legacy Matplotlib 3D registration), consider removing them to keep the dependency surface and reader cognitive load down.

Suggested change

```diff
-import random
 from functools import partial

 import numpy as np
 import numba
 import quantecon as qe
 import matplotlib.pyplot as plt
-from mpl_toolkits.mplot3d.axes3d import Axes3D
```

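For context on the "legacy registration" caveat: since Matplotlib 3.2 the 3D projection is registered automatically, so `projection='3d'` works without importing Axes3D. A minimal check (using the Agg backend so it runs headless; this is a sketch, not code from the lecture):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so no display is needed
import matplotlib.pyplot as plt

# Since Matplotlib 3.2, the 3D projection is available without
# `from mpl_toolkits.mplot3d.axes3d import Axes3D`
fig = plt.figure()
ax = fig.add_subplot(projection="3d")
print(ax.name)  # the axes reports which projection it uses
```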
```python
def f2(x):
    x = sum((x**i + i) for i in range(3))
    return x

y = f1(x) if x < 0 else f2(x)
```

Copilot AI Apr 8, 2026


This control-flow example will fail under jax.grad: the Python conditional y = f1(x) if x < 0 else f2(x) requires converting a JAX tracer/DeviceArray boolean to a Python bool, which raises an error. Use jax.lax.cond (or jnp.where for scalar cases) for data-dependent branching when differentiating/jitting.

Suggested change

```diff
-y = f1(x) if x < 0 else f2(x)
+y = jax.lax.cond(x < 0, f1, f2, x)
```

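To sketch why this matters (f1 here is a placeholder branch, not the lecture's actual function): with `jax.lax.cond` both branches stay inside the traced computation, so the conditional differentiates cleanly:

```python
import jax
import jax.numpy as jnp

def f1(x):
    return -x  # placeholder negative branch for illustration

def f2(x):
    return sum((x**i + i) for i in range(3))  # 1 + (x+1) + (x**2+2)

def f(x):
    # A Python `if x < 0` on a traced value would try to convert a
    # tracer to bool and fail under jax.grad/jax.jit; lax.cond instead
    # traces both branches and selects at run time.
    return jax.lax.cond(x < 0, f1, f2, x)

print(jax.grad(f)(2.0))  # derivative of x**2 + x + 4 at x=2 is 5.0
```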
```python
Δx = new_x - x
Δdf = new_df - df
λ = jnp.abs(Δx @ Δdf) / (Δdf @ Δdf)
ϵ = jnp.max(jnp.abs(Δx))
```

Copilot AI Apr 8, 2026


grad_descent mixes JAX arrays with a Python while condition: after the first iteration, ϵ becomes a JAX scalar (jnp.max(...)), so while ϵ > tol will raise a boolean-conversion error. Either keep ϵ as a Python float (e.g., ϵ = float(...)) or implement the loop with jax.lax.while_loop / lax.fori_loop if you want it to stay in JAX land.

Suggested change

```diff
-ϵ = jnp.max(jnp.abs(Δx))
+ϵ = float(jnp.max(jnp.abs(Δx)))
```

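A minimal sketch of the Python-side fix (hypothetical objective and step rule, not the lecture's grad_descent): casting the convergence measure to a Python float keeps the loop condition an ordinary bool on the host:

```python
import jax
import jax.numpy as jnp

def grad_descent(f, x0, lr=0.1, tol=1e-6, max_iter=10_000):
    df = jax.grad(f)
    x = jnp.asarray(x0, dtype=jnp.float32)
    for _ in range(max_iter):
        new_x = x - lr * df(x)
        # float(...) pulls the JAX scalar back to Python, so the
        # comparison below never tries to convert a tracer to bool
        eps = float(jnp.max(jnp.abs(new_x - x)))
        x = new_x
        if eps < tol:
            break
    return x

# minimize (x - 3)^2 starting from 0; converges to x = 3
x_star = grad_descent(lambda x: (x - 3.0) ** 2, 0.0)
```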

github-actions bot commented Apr 8, 2026

