# algorithm-engineering

- **datawhalechina** (Python · 6.19k stars · updated 13 hours ago)
- **kahypar** (C++ · 483 stars · updated 2 months ago)
  KaHyPar (Karlsruhe Hypergraph Partitioning) is a multilevel hypergraph partitioning framework providing direct k-way and recursive-bisection-based partitioning algorithms that compute solutions of ver…
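The recursive-bisection scheme named in the description above can be sketched independently of KaHyPar: to obtain k blocks, split the node set in two, then recursively bisect each side until k blocks remain. The sketch below is illustrative only; a naive proportional split stands in for KaHyPar's multilevel coarsen/bisect/refine pipeline, and none of the names are KaHyPar's API.

```python
# Illustrative sketch of recursive bisection into k parts.
# A real multilevel partitioner (e.g. KaHyPar) coarsens the hypergraph,
# bisects the small instance, and refines while uncoarsening; here only
# the recursion structure is shown, with a naive balanced split.

def recursive_bisection(nodes, k):
    """Partition `nodes` into k blocks by repeated bisection."""
    if k == 1:
        return [list(nodes)]
    # split k as evenly as possible between the two sides
    k_left = k // 2
    k_right = k - k_left
    # cut proportionally to the number of blocks each side must produce,
    # so odd k still yields size-balanced blocks
    cut = len(nodes) * k_left // k
    return (recursive_bisection(nodes[:cut], k_left)
            + recursive_bisection(nodes[cut:], k_right))

blocks = recursive_bisection(list(range(8)), 4)
print(blocks)  # four blocks of two nodes each
```

A production partitioner additionally minimizes the cut between blocks subject to a balance constraint; the recursion skeleton, however, is exactly this.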
- **kahypar** (C++ · 155 stars · updated 13 hours ago)
  Mt-KaHyPar (Multi-Threaded Karlsruhe Hypergraph Partitioner) is a shared-memory multilevel graph and hypergraph partitioner equipped with parallel implementations of techniques used in the best sequen…
- **flandolfi** (TeX · 14 stars · updated 7 years ago)
  Exercises for the Algorithm Engineering (ALE) course at the University of Pisa.
- **molaupi** (C++ · 12 stars · updated 12 days ago)
  Karlsruhe Rapid Ridesharing (KaRRi) dynamic taxi-sharing dispatcher.
- **kahypar** (11 stars · updated 3 years ago)
  A list of all publications related to the KaHyPar frameworks.
- **d-krupke** (HTML · 7 stars · updated 10 months ago)
- **d-krupke** (Python · 5 stars · updated 8 months ago)
  Experiment execution and result management for empirical evaluations of algorithms in Python.
- **kahypar** (R · 3 stars · updated 3 years ago)
  A collection of our hypergraph partitioning experiments.
- **TiFu** (C++ · 1 star · updated 7 years ago)
  Graph-coloring implementation based on the XRLF algorithm.
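XRLF is an extension of the Recursive Largest First (RLF) coloring heuristic, which builds one color class at a time rather than coloring vertices one by one. The following is a simplified RLF-style sketch, not the repository's code: each round greedily grows an independent set among the uncolored vertices and assigns it one color.

```python
# Simplified RLF-style graph coloring (illustrative, not XRLF itself).
# adj: dict mapping each vertex to the set of its neighbors.

def rlf_coloring(adj):
    """Color a graph by repeatedly extracting an independent set.

    Returns a dict vertex -> color index. Each iteration builds one
    color class: candidates are vertices not adjacent to the class,
    and we greedily pick the candidate with the most uncolored
    neighbors (a common RLF-style tie-down of the selection rule).
    """
    uncolored = set(adj)
    colors = {}
    color = 0
    while uncolored:
        candidates = set(uncolored)
        cls = set()
        while candidates:
            # pick the candidate with most neighbors among uncolored vertices
            v = max(candidates, key=lambda u: len(adj[u] & uncolored))
            cls.add(v)
            candidates.discard(v)
            candidates -= adj[v]  # neighbors of v can no longer join the class
        for v in cls:
            colors[v] = color
        uncolored -= cls
        color += 1
    return colors
```

On a path 0-1-2 this first colors the middle vertex, then puts both endpoints into a second class, using two colors; XRLF adds exhaustive-search refinements on top of this greedy skeleton.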
- **sourceduty** (1 star · updated 4 months ago)
  Research and develop quantum computing algorithm engines to output models that process vast amounts of scientific data at unprecedented speeds.
- **SpencerDeMera** (JavaScript · 0 stars · updated 3 years ago)
- **artzqs** (0 stars · updated 4 days ago)
  Research and develop quantum computing algorithm engines to output models that process vast amounts of scientific data at unprecedented speeds.