drink not drunk

The Simplest Yet Most Difficult Highball

Categories: Whisky, Highball

Author: Tony D

Published: April 24, 2025

Just like Sichuan cuisine's kaishui baicai (cabbage in supreme broth) or Cantonese char siu fried rice with egg: the simplest dishes are the hardest to perfect. Let's learn how a master makes one. PS: for "highball," I prefer the AI's rendering of it as 海波 (hǎibō, "sea wave").

AI summary: This conversation explores the essence and craft of the Japanese-style Highball. A drink popular worldwide and closely tied to Japanese bar culture, the Japanese version is especially attentive to detail and technique. The conversation walks through the key steps to a perfect Highball: using ice tempered at room temperature to preserve as much of the soda's carbonation as possible, pouring the soda correctly to avoid losing bubbles, and stirring gently. These seemingly minor details have an outsized effect on the Highball's texture, layers of flavor, and overall balance. It also highlights the clever use of a lemon peel, which adds a fresh aroma and accentuates the whisky's body. In short, the conversation elevates the Highball to an art, revealing the craftsmanship and care behind it.

  • 上野秀嗣 @Bar high five

  • Video source: Mr Lyan’s Taste Trips

  • Transcription by mlx_whisper

  • Translation and summary by gemini-2.0-flash

Show the code
library(ellmer)     # LLM chat interface
library(tidyverse)
library(srt)        # read/write .srt subtitle files
library(tuneR)      # read MP3 audio
library(stringr)

# Convert mp4 to mp3 using ffmpeg
system("ffmpeg -i video.mp4 audio.mp3")
mp3_title <- "audio.mp3"
# Load the MP3 file
mp3_file <- readMP3(mp3_title)
# Duration in minutes = samples / sample rate / 60
duration_mins <- (length(mp3_file@left) / mp3_file@samp.rate) / 60
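The duration arithmetic above (samples ÷ sample rate ÷ 60) is easy to sanity-check. A minimal Python sketch; the sample count and rate here are made-up illustrative numbers, not values from the actual audio file:

```python
def duration_minutes(n_samples: int, sample_rate: int) -> float:
    """Return audio duration in minutes given a sample count and a sample rate in Hz."""
    return n_samples / sample_rate / 60

# A 44.1 kHz file with 26,460,000 samples is 600 seconds, i.e. exactly 10 minutes.
print(duration_minutes(26_460_000, 44_100))  # → 10.0
```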
Show the code
# Transcribe mp3 to text using mlx_whisper
import mlx_whisper
from whisper.utils import get_writer

speech_file = "audio.mp3"
# Using the mlx-community/whisper-large-v3-turbo model
result = mlx_whisper.transcribe(
    speech_file,
    path_or_hf_repo="mlx-community/whisper-large-v3-turbo",
    word_timestamps=True,
)

# Write the transcript as both .srt subtitles and plain text
srt_writer = get_writer("srt", ".")
srt_writer(result, "text.srt")

txt_writer = get_writer("txt", ".")
txt_writer(result, "text.txt")
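The `.srt` writer handles timestamp formatting for us. As a rough sketch of what that format looks like (the conventional `HH:MM:SS,mmm` SRT timestamp), independent of mlx_whisper:

```python
def srt_timestamp(seconds: float) -> str:
    """Format a time in seconds as an SRT-style HH:MM:SS,mmm timestamp."""
    ms = round(seconds * 1000)
    h, ms = divmod(ms, 3_600_000)   # hours
    m, ms = divmod(ms, 60_000)      # minutes
    s, ms = divmod(ms, 1_000)       # seconds and milliseconds
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

print(srt_timestamp(83.5))  # → 00:01:23,500
```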
Show the code
# Translate English to Chinese using gemini-2.0-flash
srt_txt0 <- read_srt("text.srt")
srt_txt2 <- srt_txt0$subtitle |> as.character()

chat_gemini_model_translate <- chat_gemini(
  system_prompt = "你是一个中文和英文的翻译专家",  # "You are an expert Chinese-English translator"
  api_key = keyring::key_get("google_ai_api_key"),
  model = "gemini-2.0-flash"
)
# Prompt: translate each line in context, keep the sentence count unchanged,
# no extra commentary, output format: original text《---》Chinese translation
prompt_text <- paste0(
  "请联系上下文把以下文字翻译成中文。总句子数量不变。不要多余的反馈。输出格式为:翻译前的文字《---》翻译后的中文",
  paste(srt_txt2, collapse = "\n")
)
chat_result1 <- chat_gemini_model_translate$chat(prompt_text)
all_result2 <- unlist(strsplit(chat_result1, "\n"))
# The translation must come back with one line per subtitle
stopifnot(length(all_result2) == nrow(srt_txt0))

srt_txt <- srt_txt0 |>
  mutate(
    correct_txt = all_result2 |>
      str_replace("!!!!", "") |>          # strip stray markers
      str_extract("(?<=《---》).*"),       # keep only the Chinese translation
    all_correct_txt = all_result2
  )
cn_subtitle <- srt_txt |> select(n, start, end, subtitle = correct_txt)
srt::write_srt(cn_subtitle, "cn.srt", wrap = FALSE)

# Burn the subtitles into the mp4 using ffmpeg
input_video <- "video.mp4"
subtitle_file <- "cn.srt"
output_video <- "output.mp4"
ffmpeg_command <- paste0(
  "ffmpeg -i \"", input_video, "\"",
  " -vf \"subtitles=", subtitle_file, ":force_style='Fontsize=20'\"",
  " -c:a copy -c:v libx264 -crf 23 -preset veryfast \"", output_video, "\""
)
system(ffmpeg_command)

# Summarize using gemini-2.0-flash
chat_gemini_model_summary <- chat_gemini(
  system_prompt = "你是一个中文和英文的语言专家",  # "You are an expert in Chinese and English"
  api_key = keyring::key_get("google_ai_api_key"),
  model = "gemini-2.0-flash"
)
prompt_text <- paste0("Please write an abstract: ", paste(srt_txt2, collapse = "\n"))
Abstract_result_en <- chat_gemini_model_summary$chat(prompt_text)
Abstract_result_en

prompt_text <- paste0("请作中文摘要:", paste(srt_txt2, collapse = "\n"))  # "Please write a Chinese abstract:"
Abstract_result_cn <- chat_gemini_model_summary$chat(prompt_text)
Abstract_result_cn
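The `str_extract("(?<=《---》).*")` step above can be mirrored in Python for a quick check of the 《---》 parsing; the sample line below is illustrative, not actual model output:

```python
import re

def extract_translation(line: str):
    """Each translated line comes back as 'original《---》Chinese translation';
    return only the part after the 《---》 marker, or None if the marker is missing."""
    m = re.search(r"(?<=《---》).*", line)
    return m.group(0) if m else None

print(extract_translation("Ice matters.《---》冰很重要。"))  # → 冰很重要。
```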
 
 

This blog is built with ❤️ and Quarto.